Saturday, December 24, 2011

Where my Scala makes my Zipper Optional

Howdy.

As told in previous posts, I have read Learn You a Haskell for Great Good. The book, in addition to being humorous and filled with the author's hand-drawn pictures, is worthwhile reading for people who, like me, embraced functional programming only a few months ago.
The reading took some time, and in some sense started changing my way of coding. Do not misunderstand me: as a Java developer by day I have no other choice than coding in an Object Oriented style - as far as Java permits - and I do not reject this approach. I simply found some answers to technical questions and a more personal way of looking at code.

Embracing a new paradigm strengthens your ability to constantly make your approach to coding evolve. It is a matter of personal opinion, but I want my own code to evolve, always. Learn You a Haskell was a first shot, and at this very moment Programming in Haskell and The Craft of Functional Programming lie beside me on the desk while my ordered PDF copy of Real World Haskell patiently waits on my SSD USB key.

Those who know me know I love both Clojure and Scala, and this is not just an excuse to buy fancy geeky t-shirts, I mean it. Practicing Haskell for so long while not doing Scala has been a burden. At some point I had to come back to Scala with a simple exercise of my own in order to restart coding in the language. I needed a small kata, nothing specific, just something to get into Scala again, even a single class. Unfortunately, without a current position or a pet project to tease my mind, ideas were not easy to find. I found it amusing to revisit the last chapter of Learn You a Haskell and started coding a very small zipper.
Today's post is just a very small ramble about coming back to Scala and trying to feel the code differently.

In 1997 Gérard Huet from INRIA (even I sometimes have French references) published an interesting article on the Zipper, a simple approach to tree navigation and modification. Instead of considering a tree as a global context, one may find it more suitable to focus on a single node: to work on it, to navigate from it, or to change some local information, like the data bound to the node. And that is it.
A zipper is essentially a magnifying glass focused on a specific branch of the tree, rooted at some node. In order to move locally up, right or left from this branch after manipulating it, we only need to remember the little world around us and the path taken to reach this position (typically the path from the root).
Suppose we have to modify data bound to several nodes that are not far from one another: the ability to move from node to node while remembering where we came from is a real blessing. There is no need to seek out each node from the root every time we want to modify it; we simply fire a sequence of moving commands. In Gérard Huet's own words, going down and up through the tree is like zipping and unzipping a piece of cloth, hence the name.
As far as trees are concerned, knowing the world around our branch means knowing the parent node and the sibling branches.

Everything starts with a tree definition. 
After three months of Haskell, I could no longer see a tree definition as a class template describing clusters of objects, but rather as a union of types (you will have to trust me on this assertion :)). Stephan Boyer's post on this point of view is quite explicit. A tree is either an empty leaf or a node. In Haskell it is defined as:

data Tree a = Empty | Node a (Tree a) (Tree a) deriving (Show)

which becomes in Scala

package com.promindis.data

sealed trait Tree[+A]
object Empty extends Tree[Nothing]  {override def toString = "Empty"}
final case class Node[+A](value: A, left: Tree[A], right: Tree[A]) extends Tree[A]

The Scala definition concedes two lines to the Haskell one (no big deal frankly; at a certain point the line-count battle does not mean anything, and battles around languages in general do not mean a lot).

Why make things complex? A tree definition can be hosted by a simple trait. The type parameter A defines the type of the data bound to a tree node. A Tree instance is either Empty or a Node instance.

The Tree type is defined to be covariant in its type parameter. As such, a node binding a String value is compatible with a node binding an AnyRef value. A leaf, by definition, must be compatible with anything. In Scala all types share a common lower bound, the Nothing type, so the Empty object extending Tree[Nothing] is compatible with Tree instances holding data of any type. Given that an empty leaf carries no data, I found it more natural to make it a singleton object.
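A quick REPL check, just a sketch built on the definitions above, shows what this covariance buys us:

// Tree is covariant in A, so a Tree[String] may be used where a Tree[AnyRef] is expected,
// and Empty, being a Tree[Nothing], fits anywhere.
val names: Tree[String] = Node("root", Empty, Empty)
val anything: Tree[AnyRef] = names   // accepted because String <: AnyRef
val leaf: Tree[String] = Empty       // accepted because Nothing <: String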

I call the little world around a tree node a Spot (and not a breadcrumb like Miran Lipovača; I leave that cute name to him :)). Spots gathered in a list will make a complete path, recording for each step whether we came from the right or from the left. Therefore our type definition becomes:

package com.promindis.data

sealed trait Spot[+A]
final case class LeftSpot[+A](value: A, right: Tree[A]) extends Spot[A]
final case class RightSpot[+A](value: A, left: Tree[A]) extends Spot[A]

The Spot type is defined to be covariant in its type parameter, with the same meaning as the covariance of the Tree type. So far so good (I told you there would be little code).

At that point we are ready to play with zippers. What is a zipper? A friendly (immutable, of course) object gathering the current working Tree node and the path we came from, so:

final case class Zipper[+A](tree: Tree[A], spots: List[Spot[A]])

As explained before, the list of spots makes a path.
Being case class parameters, the tree and spots fields are vals, hence immutable. Good.

Focusing on the up, right, left and update functions on our tree, we can simply create a new immutable Zipper as the result of each operation. Or can we? What would happen if, by any means, we rambled too deep into our tree structure? No doubt something bad would occur, as no one can go left or right from a leaf, for example.
This is a potential error case.
Not watching our steps and going too far should lead us nowhere, but we should not suffer nasty side effects like null pointers or invalid method calls. A nice Scala solution exists that confines our Zipper in a "cotton" context, protecting us from these side effects.

We can take our Zipper back from the context whenever we want, and the "cotton" context will not harm us. The Option type is our solution: our Zipper is simply optional. It looks like Some(Zipper(tree, list)) when it exists, or None when there is no Zipper. As such, the left() method definition becomes:

 def left(): Option[Zipper[A]]

The right and up methods will also return Option instances.
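Without any syntactic help, chaining two moves would mean unwrapping each Option by hand, something along these lines (a sketch, assuming a tree value like the one defined a little further down):

Zipper(tree, List()).left() match {
  case Some(child) => child.up() match {
    case Some(root) => println(root)
    case None => println("cannot go up from here")
  }
  case None => println("cannot go left from a leaf")
}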

Frankly, at that point, we are nearly done. Of course I am going to use a test-driven approach. But wait a minute: in order to ramble through my tree and check my values, will I have to extract my zipper at each step?
That could become particularly cumbersome! Scala provides us with the same tools as for Map or List: one can extract the content of an Option using the tools used for collection iteration, namely the <- sugar of the for comprehension. So, starting from the following structure

  val leftChild = Node(2,Node(3, Empty, Empty),Node(4, Empty, Empty))

  val rightChild = Node(5, Node(6, Empty, Empty), Empty)

  val updatedChild = Node(7, Node(6, Empty, Empty), Empty)

  val tree = Node(1, leftChild, rightChild)


we can test the zipper after going left and then up this way:

 def e2() = {for (
      left <- Zipper(tree, List()).left();
      up <- left.up() )
    yield up}.should(beSome(Zipper(tree, List())))

where e2 defines a Specs2 match result. Going to the left produces an Option context.

Applying <- to Zipper(tree, List()).left() binds the extracted value to the local left variable. We then reuse it on the next line in order to go up. The result is again wrapped in a protecting Option context. At this point Specs2 provides the suitable DSL to extract and then challenge the value.

You may have guessed it: the Option type is a Monad, offering a context in which to embed your values and manipulate them while protecting your code from side effects. Paraphrasing Miran Lipovača, the for comprehension and its <- form provide an easy-to-read syntax for gluing together monadic values in sequence. In our case the for comprehension is very similar to the do notation in Haskell.
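For the record, the for comprehension in e2 is roughly sugar over flatMap and map on Option; the compiler produces something equivalent to this sketch:

Zipper(tree, List()).left().flatMap { left =>
  left.up().map { up => up }
}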
Of course monads are not just applicable design patterns, and everyone can learn their deeper meaning from category theory, but you can view them as design patterns very well suited to solving programming problems. The Option monad offers a possible-failure context wrapping optional values, while the List monad, for example, can be read as embedding non-deterministic values.
At some point one has to start working with these notions in order to progress, and I think it is possible to start from the application level before moving up to the more general notions of category theory. The application level offers the advantage of everyday practice.
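The same for syntax over Lists reads as "all the combinations", which is the non-deterministic flavour just mentioned. A toy sketch, unrelated to the zipper:

val pairs = for {
  x <- List(1, 2)
  y <- List("a", "b")
} yield (x, y)
// pairs == List((1,"a"), (1,"b"), (2,"a"), (2,"b"))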

So now, before I forget, here is the testing code I wrote:


package com.promindis.data

import org.specs2.Specification

final class ZipperSpecification extends Specification{def is =

  "Zipper Specification"                      ^
                                              p^
    "Left method in Zipper should"            ^
    "go left from current scope"              !e1^
    "return to root applying up to go Left"   !e2^
                                              p^
    "Right method in Zipper should"           ^
    "go right from root"                      !e3^
    "return to root applying up to go right"  !e4^
                                              p^
    "Navigation should"                       ^
    "prevent me from going too far"           !e5^
                                              p^
    "Update should"                           ^
    "allow me to update right node"           !e6

  val leftChild = Node(2,Node(3, Empty, Empty),Node(4, Empty, Empty))

  val rightChild = Node(5, Node(6, Empty, Empty), Empty)

  val updatedChild = Node(7, Node(6, Empty, Empty), Empty)

  val tree = Node(1, leftChild, rightChild)

  def e1()  =
    Zipper(tree, List()).left()
      .should(beSome(Zipper(leftChild, List(LeftSpot(1, rightChild)))))

  def e2() = {for (
      left <- Zipper(tree, List()).left();
      up <- left.up() )
    yield up}.should(beSome(Zipper(tree, List())))

  def e3() =
    Zipper(tree, List()).right()
      .should(beSome(Zipper(rightChild, List(RightSpot(1, leftChild)))))

  def e4() = {for (
      right <- Zipper(tree, List()).right();
      up <- right.up() )
    yield up}.should(beSome(Zipper(tree, List())))

  def e5 = { for (
      right <- Zipper(tree, List()).right();
      deeperRight <- right.right();
      tooFar <- deeperRight.right()
    ) yield tooFar}.should(beNone)

  def e6 = {for(
    right <- Zipper(tree, List()).right()
  ) yield right.updated(_ => 7)}
    .should(beSome(Zipper(updatedChild, List(RightSpot(1, leftChild)))))
}

leading to the implementation:

package com.promindis.data

final case class Zipper[+A](tree: Tree[A], spots: List[Spot[A]]) {

  def updated[B >: A](f: A => B): Zipper[B] = {
    tree match {
      case Node(value, left, right) => 
        Zipper(Node(f(value), left, right), spots)
      case Empty => this
    }
  }

  def left(): Option[Zipper[A]] = {
    tree match {
      case Node(value, left, right) => 
        Some(Zipper[A](left, LeftSpot(value, right)::spots))
      case Empty => None
    }
  }

  def right(): Option[Zipper[A]] = {
    tree match {
      case Node(value, left, right) => 
        Some(Zipper[A](right, RightSpot(value, left)::spots))
      case Empty => None
    }
  }

  def up(): Option[Zipper[A]] = {
    spots match {
      case LeftSpot(value, right)::xs =>
        Some(Zipper(Node(value, tree, right), xs))
      case RightSpot(value, left)::xs =>
        Some(Zipper(Node(value, left, tree), xs))
      case Nil => None
    }
  }
}


Progressively coming back to Scala, more determined than ever, you walked by my side creating a small Zipper and finally making use of the Option monad. Not bad for a Christmas day. Merry Christmas to all. Be seeing you!!! :)

Sunday, December 18, 2011

Where we apply destructuring, pre-condition checking and data decoration

Hello again.

I just finished reading Learn You a Haskell for Great Good and I am trying to find some examples to put on the blog in order to explore (applicative) functors and monads. As these things take time, the blog is still empty on that front.
But a kind colleague of mine keeps insisting that I should publish thoughts on a more regular basis, even when they are just reminders or basics. So Adrian, I am fulfilling your request.

Having switched back to my SICP reading (stuck on chapter 2), I did the symbolic data part (almost, as exercise 2.58 is still waiting for me). For the first time I bumped into a wall with Scheme, as I do not know the language very well. All things being equal, I came back to Clojure, since I am trying to practice the book in both languages.
I then found myself using some little Clojure forms I was not expecting there, like destructuring, metadata and pre-condition testing. Abelson and Sussman present a small differentiation program aiming to derive simple expressions. Of course the program is extremely limited, the purpose being to make the student work with symbolic data. A typical example of differentiation is:

sicp.data-symbol=> (deriv '(* x y (+ x 3)) 'x)
(+ (* x y) (* y (+ x 3)))

where the first parameter of the deriv function is the expression and the second is the variable used in the differentiation process.
Although there is room here for a macro kata, I adopted the authors' approach, not feeling completely at ease with macros yet.
The interesting thing here is that we start with a "symbolic" expression, very idiomatic in Clojure, and get a symbolic expression as a result too. At some point in the book, the reader must extend the differentiation program so that both multiplications and additions accept multiple operands.

In my humble opinion the exercise should be solved using the ideas from the previous sections of the book, specifically stratified design: the notion that a system should be implemented as a sequence of layered languages. Each level is built upon a lower level, combining the latter's functions and thus providing a new means of abstraction. I found my Scheme version violating this rule.

Coming back to Clojure, how are we going to express the symbolic data? Well, we will use the quote special form, which simply prevents its argument from being evaluated. So you can try the following in the REPL:

sicp.data-symbol=> 'a
a
sicp.data-symbol=> '(1 2 3)
(1 2 3)

A symbol is equal to itself, so in the first example 'a will always mean 'a. The second example exposes the strength of the quote (') symbol as a powerful way to construct literal lists, avoiding the cluttering noise of a list call. Following the authors' indications, I produced a first set of basic predicates allowing me to classify my symbolic expressions. Expressions will be idiomatically expressed as list-like structures:

 
'(* x y (+ x 3))

Checking the nature of an expression then becomes very easy, as we simply check the first symbol in the list:

(defn variable? [expression]
  (symbol? expression))

(defn sum? [expression]
  (= '+ (first expression)))

(defn product? [expression]
  (= '* (first expression)))

(defn exponentiation? [expression]
  (= '** (first expression)))

(defn same-variable? [expression variable]
  (and (variable? expression) (= expression variable)))

challenged using:

(deftest sum?-should-recognize-input-expression
  (is (= true (sum? '(+ x 3))))
  (is (not= true (sum? '(* x 3)))))

(deftest product?-should-recognize-input-expression
   (is (= true (product? '(* x 3))))
   (is (not= true (product? '(+ x 3)))))

(deftest exponentiation?-should-identify-operation
  (is (= true (exponentiation? '( ** x 3))))
  (is (= false (exponentiation? '( * x 3)))))

Of course I did not test Clojure's native number? and symbol? functions. A lot of the job was done without tests, walking side by side with the authors' wishful-thinking approach, but I will show the tests where I did write them.

The next level of abstraction includes constructors and selectors, respectively creating the operation expressions and selecting operands from them. For example, in order to create a sum one needs a make-sum function as constructor. The selectors aim to extract the addend - the first operand of the sum expression - and the augend, the second operand. Working on the sum, we expect our constructor and selectors to handle all the dirty work of manipulating multiple arguments. The intent is to keep the messy stuff at a lower level so that the main function in charge of the differentiation process stays clean and unaware of lower-level details like the structure of a sum, of a product, etc.
Of course in the SICP book the main function has already been written by wishful thinking. In essence our sum constructor should comply with something like this:

(deftest make-sum-with-simple-arguments-should-produce-expression
  (is (= '(+ x 3) (make-sum 'x 3))))

(deftest make-sum-with-sequence-argument-should-produce-expression
  (is (= '(+ x y z) (make-sum '(x y z)))))

(deftest make-sum-with-one-zero-argument-should-produce-symbol
  (is (= 'x (make-sum 'x 0))))

(deftest make-sum-with-two-zero-argument-should-produce-operation-symbol
  (is (= '(+ x y) (make-sum '(x 0 y 0)))))

(deftest make-sum-with-two-numbers-should-produce-number
  (is (= 3 (make-sum '(1 2)))))

We introduced, like in the book, simple reduction rules allowing us to get rid of identity operands when possible. The resulting expressions are complex enough that we can try to reduce a little bit of that complexity.
The quality of the solution I provide is debatable, and as usual I am open to criticism. Not fully at ease with wishful thinking, I was tempted to guard my functions with small contracts on their inputs (a stack overflow never being far away) and explored variable arity in order to satisfy both the main function's invocations and the selectors' invocations:

(defn make-sum
  ([x y] (make-sum [x y]))
  ([[_ :as expression]]
    (cond
      (= 1 (count expression)) (first expression)
      (all-numbers? expression) (apply + expression)
      (:simplified (meta expression)) (cons '+ expression)
      :else (make-sum
              (with-meta
                (filter (fn [x] (not= 0 x)) expression)
                {:simplified true})))))

(defn addend [expression]
  {:pre [(>= (count expression) 3)]}
  (second expression))

(defn augend [[op _ & cdr :as expression]]
  {:pre [(>= (count expression) 3)]}
  (make-sum cdr))

The SICP book strongly suggests turning a '(+ x y z) expression into a '(+ x (+ y z)) one when looking for the augend (so far, the interpretation is mine).
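At the REPL, with the definitions above, the effect looks something like this:

sicp.data-symbol=> (augend '(+ x y z))
(+ y z)
sicp.data-symbol=> (augend '(+ x y))
y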
 So now we are talking.

In fewer than 20 lines I used a lot of idiomatic Clojure. As told earlier, my main function already exists (check the end of the post) and expects my make-sum function to expose a two-argument signature. While extracting the augend from my sum expression I do not want to clutter the code of the augend function with expression re-composition.
Composing a new sum from an existing one is the purpose of the make-sum constructor. I can isolate the subset of the list I want my constructor to use in order to create a new sum expression. So my make-sum function exposes two signatures: one with two parameters, and a second with a single sequence.

Isolating the rest of the expression in order to create a new sum expression can become annoying, especially when a let form gets cluttered with combinations of first and rest applied to the input expression. Then comes destructuring. As noted by one of the readers (see comments), destructuring differs slightly from pattern matching (one may find interesting discussions here and there).
Let's say, as a first approximation, that destructuring allows us to bind inner elements of an input sequence or map to local variables. As such, applying:

[op _ & cdr :as expression]

will
  • bind op to the first element,
  • ignore the second element, as idiomatically expressed with the _ wildcard,
  • and, using the & separator, gather all the remaining elements in a sequence referred to by the cdr name.
The :as keyword aliases the complete input collection under the expression name.
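A quick REPL check of these bindings, just as a sketch:

sicp.data-symbol=> (let [[op _ & cdr :as expression] '(+ x y z)]
                     [op cdr expression])
[+ (y z) (+ x y z)]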

My intent in aliasing the whole input expression was to provide meat for pre-condition checking (yes, as in design by contract). Pre-condition checks are a blessing, especially when they stop you before your blindness fires a stack overflow :).
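For instance (the exact wording of the error depends on the Clojure version), a call violating the contract fails immediately instead of recursing:

;; a degenerate sum has fewer than three elements, so the :pre contract stops us right away
(addend '(+ x))
;; => AssertionError: Assert failed: (>= (count expression) 3)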
Back to the make-sum variable arity, one can see that I delegate all the work to the version accepting a sequence as parameter. I could have avoided destructuring in that second signature, but I found it an idiomatic way to express that the function specifically expects a sequence.

As the tests showed earlier, I had to take into account various scenarios: the identity element for the sum, which can lead to a single-element expression, the case where all the operands are plain numbers, etc.

Filtering identity elements induces a recursion step after the filter step. We would need to remember some... state (?) in order not to filter zeros from the input expression indefinitely. But state is bad, we all know it. Decorating the filtered list with metadata is a smoother way to proceed. Once the immutable filtered sequence is created, we neither alter its content nor add a third method signature. We attach metadata to the data:

 (with-meta
   (filter (fn [x] (not= 0 x)) expression)
   {:simplified true})

informing the function that no more recursion is necessary:

(:simplified (meta expression)) (cons '+ expression)

and the construction of the sum expression can proceed.
The set of operations is similar for the product expression constructor and selectors.
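One reassuring detail, sketched below: the metadata travels with the sequence but does not take part in equality, so the decorated expression still behaves like plain data.

sicp.data-symbol=> (def marked (with-meta '(x y) {:simplified true}))
#'sicp.data-symbol/marked
sicp.data-symbol=> (meta marked)
{:simplified true}
sicp.data-symbol=> (= marked '(x y))
true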

 The whole code is given by:

(defn variable? [expression]
  (symbol? expression))

(defn sum? [expression]
  (= '+ (first expression)))

(defn product? [expression]
  (= '* (first expression)))

(defn exponentiation? [expression]
  (= '** (first expression)))


(defn same-variable? [expression variable]
  (and (variable? expression) (= expression variable)))

(defn all-numbers? [in-list]
  (letfn [
           (iter [still-number remaining]
             (cond
               (not still-number) still-number
               (empty? remaining) true
               :else (recur
                       (and still-number (number? (first remaining)))
                       (rest remaining))))]
    (iter true in-list)))

(defn make-sum
  ([x y] (make-sum [x y]))
  ([[_ :as expression]]
    (cond
      (= 1 (count expression)) (first expression)
      (all-numbers? expression) (apply + expression)
      (:simplified (meta expression)) (cons '+ expression)
      :else (make-sum
              (with-meta
                (filter (fn [x] (not= 0 x)) expression)
                {:simplified true})))))

(defn addend [expression]
  {:pre [(>= (count expression) 3)]}
  (second expression))

(defn augend [[op _ & cdr :as expression]]
  {:pre [(>= (count expression) 3)]}
  (make-sum cdr))

(defn make-product
  ([x y] (make-product [x y]))
  ([[_ :as expression]]
    (cond
      (= 1 (count expression)) (first expression)
      (> (count (filter (fn [x] (= 0 x)) expression)) 0) 0
      (all-numbers? expression) (apply * expression)
      (:simplified (meta expression)) (cons '* expression)
      :else (recur
              (with-meta
                (filter (fn [x] (not (= 1 x))) expression)
                {:simplified true})))))

(defn multiplier [expression]
  {:pre [(>= (count expression) 3)]}
  (second expression))

(defn multiplicand [[op _ & cdr :as expression]]
  {:pre [(>= (count expression) 3)]}
  (make-product cdr))

(defn make-exponentiation [base exponent]
  (cond
    (= 0 exponent) 1
    (= 1 exponent) base
    (and (number? base) (= 1 base)) 1
    (and (number? base) (= 0 base)) 0
    :else (list '** base exponent)))

(defn base [from-exponentiation]
  {:pre [(= 3 (count from-exponentiation))]}
  (second from-exponentiation))

(defn exponent [from-exponentiation]
  {:pre [(= 3 (count from-exponentiation))]}
  (second (rest from-exponentiation)))


(defn deriv [expression variable]
  (println expression " derived by " variable)
  (cond
    (number? expression)
    0
    (variable? expression)
    (if (same-variable? expression variable) 1 0)
    (sum? expression)
    (make-sum
      (deriv (addend expression) variable)
      (deriv (augend expression) variable))
    (product? expression)
    (make-sum
      (make-product
        (multiplier expression)
        (deriv (multiplicand expression) variable))
      (make-product
        (deriv (multiplier expression) variable)
        (multiplicand expression)))
    (exponentiation? expression)
    (do
      (make-product
        (make-product
          (exponent expression)
          (make-exponentiation (base expression) (dec (exponent expression))))
        (deriv (base expression) variable)))
    :else (println "Invalid expression")))

I am working on a GitHub account, so next time I hope to give you a link :)

We have been using destructuring, metadata and variable arity, all idiomatic Clojure features, in one small and not so unrealistic program (shoot me if you never had to write a small interpreter in your coder's life; if not, you have been lucky).

PS: For Schemers, why did I not find my solution satisfactory? Because my first shot was:

(define (make-sum x y) 
  (cond ((=number? x 0) y)
        ((=number? y 0) x)
        ((and (number? x) (number? y)) (+ x y))
        ( else (list '+ x y))))

(define (augend sum-expr) 
  (let ((rest (cddr sum-expr)))
    (if (= 1 (length rest))
        (car rest)
        (cons '+ rest))))

From my point of view, the augend function should not build a new sum itself; the make-sum function should take that in charge. I will look for a solution using the '.' notation in the parameter list.

 Be seeing you !!! :)

Saturday, December 10, 2011

A day at Clojurex

Hello again, and sorry for not coming back on a more regular basis.

I guess next time I will have to pull some of my exercises from my SICP studies or Project Euler explorations in order to show new code. Cross my heart, I will try to produce one more code post before the end of the year.

So what happened ?
Three nice things happened.

I am reading Learn You a Haskell for Great Good in order to progress in Scala while exploring type systems through the nearly perfect Haskell language.
I am doing each exercise of SICP in both Scheme and Clojure.

As a matter of fact these things take time, hence the small blog blackout. In order to take a cool break I decided to try a third exploration: going to a Skillsmatter exchange on Clojure. Maybe this was the best idea of the year.
The Clojurex event happened on December 1st. Before I forget everything, let's talk about it. No code this time, just talk (not too long, I swear).

London's calling

I have been to London twice in forty years: once when I was twelve, and now for Clojurex. I won't wait that long again, as I have already registered for Scala Days in April.
London has changed of course, and from a more adult point of view I really felt at home. First of all, I found welcoming and smiling people everywhere. Sharing a little of their lives for two days was a joy, a hundred times more pleasant than sharing the lives of French people.
This reinforced my wish to work over there.


At Skillsmatter

The Skillsmatter staff is awesome; I cannot find any other words to describe the organisation and the welcome.
They managed to reorganise the event after Uncle Bob's misadventure with customs officers, and shame on me for hesitating one single second when that happened.
Special thanks to the kind lady who managed to catch a cab so I could make my Eurostar just an hour before departure: perfect timing.

I could not list all the places the attendees came from, but in essence there was a real community spirit of curious, open-minded people, from older experienced Lispers to fascinating young skilled people practicing Java, Scala and alternative languages.
My friends won't understand (:)) but I chose to remain quiet and listen to the different conversations, benefiting from everyone's everyday experience. Whenever I was near a group, people naturally included me as if I were a longtime friend. Thanks to all of them.

The presentations

Leaving at about 3pm, I watched the morning presentations, unfortunately missing the afternoon brainstorming and Uncle Bob's Skype conference (yes, he finally made it over the internet). I will not detail the presentations; I would rather encourage the reader to check the matching podcasts and then each library's or framework's web site in order to try the stuff. My involuntary freedom starting in January will give me some time to try each of them and give feedback when possible.

Knowing my reluctance to reach for frameworks instead of applying patterns and algorithms, why the hell did I attend framework presentations? Because the creators were there to discuss the purpose of their tools, advocating above all a better knowledge of Clojure and a careful way of depending on libraries without breaking language idioms.
Obviously nobody attended the presentations in order to criticise other languages or promote this or that technology, as we often read or hear from some Java framework extremists (guess who I am talking about). The collective spirit was turned towards better programming, paradigm adoption, and use of one's brain rather than towards framework propaganda and the dumb assembly of bricks.
For that thanks again pals :)

Bruce Durling presented Incanter, a platform for statistical computing (counting among its committers David Edgar Liebke, Bradford Cross, Michael Fogus, Mark Fredrickson, ...). The presentation, available online - like all the others - is very impressive, as Bruce clearly demonstrated that developers working on data-driven applications could find in Incanter a valuable asset, very respectful of Clojure idioms in its DSL forms and expressions targeting data-set manipulation. I was personally impressed by the ease of use of higher-order operations like joining, filtering, grouping by columns and other similar facilities. And guess what, there is also a plotter :)

Robert Rees, like some of us, tried to get Clojure into a live environment and have it stay there. He succeeded by proceeding carefully, starting with tasks whose simplicity could easily reveal Clojure's strength, and incrementally increasing the space given to the language, baby step by baby step. So Clojure can be used in production with no harm; quite the contrary.

Another data-flow-oriented category of applications is web applications. I gave up on pure web development years ago because of my ignorance of functional programming, not being a genuine computer scientist. I was always frustrated by the complexity of setting up suitable patterns matching both my domain and my business problems. From time to time the OOP model simply did not fit, specifically when dealing only with flows of data to be handled, filtered and manipulated with basic business rules.
Like Lift in Scala, Webnoir (written by Chris Granger and introduced here by John Stevenson) allows you to rapidly develop websites in Clojure using a purely functional approach. Looking at the presentation, I could not help thinking of two or three projects I worked on years ago, and how much this non-MVC approach would have answered the myriad problems we had to face.
I enjoy the MVC approach, but from time to time MVC does not fit. Functional programming does, ergo Webnoir does too. I will try Webnoir as I will try Lift or Play!. And why not come back to web applications? :)

As someone who started working with early versions of Swing years ago, I was particularly struck by Stathis Sideris' presentation of Clarity, a new Clojure framework addressing some aspects of GUI programming. It is indeed possible to (very) easily create Swing applications from Clojure thanks to the Clarity layer. I foresee possibilities in this product for a good GUI testing environment: the fluency of the DSL Stathis developed could favour the emergence of a very efficient and idiomatic testing API, like UISpec4J in Java. Exploring all the available forms interfacing Clojure with Java, like reification and protocols, Clarity allows for a declarative way of building up GUIs (smartly proposing stylesheet-like functionality). There is no need to dig through the code in order to understand the shape or the purpose of a pane or frame; the code layout clearly expresses the intent.

Being very quiet during the whole day was not a good strategy for finding a freelance contract in the UK :). I am sorry for that, but frankly, not going to Clojurex would have been a big mistake. I got richer in knowledge and had the opportunity to cross paths with a bunch of cool, passionate people. I gained some followers on Twitter and started following others myself, gaining access to a rich stream of technical information ever since.

If fate does not behave badly, next year I will come back for another Clojurex session, the early bird in me having already ordered a seat for December 6th, 2012.

Be seeing you !!! :)