Monday, January 9, 2012

A Happy New Scala Year, Sipping a Lite Monad

Hello again, and a very happy new year.

I would like to thank last year's readers, as I did not expect so many people to visit the blog.
I hope the articles provided you with helpful information, and thanks again to those who kindly commented, each time providing precious feedback.
In addition to health and good fortune, I wish you the chance to explore all the domains you would like to explore, and to learn as many things as possible (even Spring, if you truthfully want to explore it :)).

Among all my resolutions, I promised myself to find a day job allowing for more regular practice of Scala and/or Clojure, in order to forge the base of practice needed to sustain my nightly ramblings in functional programming, type theory, and category theory.
The coming year would be a nice year if these ramblings also drove me to explore a NoSQL database and to read both Thompson's Type Theory and Pierce's TAPL. So what are your programming wishes? :)

For this first blog entry of the year, I would like to share my modest ramblings on "contextual" programming in Scala, experiments started as a continuation of my reading of Learn You a Haskell. This experience naturally led me to Martin Odersky's articles Generics of a Higher Kind and Scalable Component Abstractions, then to Joshua Suereth's Scala in Depth, specifically chapter 11 (Joshua, if you ever read this modest blog entry, thank you for providing us with such an enlightening book).

My first purpose is to mimic, in several upcoming proofs of concept, the Haskell way of separating the pure from the impure, i.e. separating the referential transparency of functions from side effects. Yes, we are talking about Monads.
It took me time to start dealing with Monads because of all the negative fuss around the Category Theory world. Like many of you, my mathematical background is far behind me. I belong to those who think it will be necessary to reawaken that part of my knowledge in order to leverage it up to an understanding of Category Theory from the mathematical point of view, while in the meantime trying things from the ground up, starting from nearly real-world examples.

I am also convinced that part of the defiance toward the monadic world comes from the name of the concept itself. Trying to work on a canonical example, I realized that after all we are dealing with Design Patterns simply named after a mathematical domain. Quoting the Gang of Four, Monads should only be "recurring solutions to common problems in software design".
Naturally, people will be less frightened by the name "Decorator Pattern" than by the expression "Monad Pattern". "Decorator" remains a familiar, everyday-life term, while the word "Monad" looks like some magic, mysterious formula.

Let's (nearly) forget about the name and focus on today's canonical example. Applying Abelson and Sussman's wishful thinking, we will not use tests today, just exploration, relying on correct compilation of our code until we run a sample. The canonical example is a simple Logger.
Many may have written about that in Scala, and I guess the impatient ones should jump to Tony Morris' blog here to get a full, brilliant implementation.
For the others, let's work on the current exploration, and why not express critiques in the comments so as to improve the implementation. The following example is an exploration and surely can be improved. So do comment and do give feedback if you feel the urge to. We may travel to this new land together :).

We would like to execute a pure function while logging information about the different steps of the process. Logging can become a very serious source of side effects, as you might want to log to external systems, flush information into files, etc. For the sake of simplicity, just imagine we append the logs. For example, multiplying 3 and 5 while logging could produce the following result:

[15, Got number 3 Got number 5 ]

where we clearly have traced the fact that 3 was created first and then 5. Providing 3 and 5 to the multiplication operator will always produce the same result, but here we have also been able to trace the input processes at the origin of the values.
Our approach relies on a context holding both the processed value and the cumulative logs. One must be able to extract the embedded value at will while continuing to append the log information. This is where higher-order functions, handling functions as first-class citizens, save us once again.
How? Let's imagine we can handle such a container of type M, holding both a value of type T and a log, a kind of

Container[T, M[_]]

A first step in the pattern implementation implies that we are able to split the responsibilities between creating container instances and applying the pure function alone to the embedded value. The following map method on our container can grant this separation:

def map[U](f: T => U): M[U]

nearly the same map as in the List class definition, for example, or in every Traversable implementation in Scala. Having internally applied the pure function f, the container handler must be able to produce a new container embedding a new value of type U. At this step the container implements a set of behaviors in concordance with what is called a Functor. As we are growing our knowledge step by step, assume we conform to the expected behavior and try to define the following:

package com.promindis.patterns

trait Functor[T, M[_]] {
  def map[U](f: T => U): M[U]
}

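To make the trait concrete before moving on, here is a minimal sketch of a container implementing this map. The Box container is hypothetical, introduced only for illustration; it is not part of the Writer built below:

```scala
trait Functor[T, M[_]] {
  def map[U](f: T => U): M[U]
}

// Hypothetical single-value container, for illustration only
case class Box[T](value: T) extends Functor[T, Box] {
  override def map[U](f: T => U): Box[U] = Box(f(value))
}

// Mapping a pure function over the embedded value yields a new Box
val doubled = Box(21).map(_ * 2)  // Box(42)
```

Nothing magical yet: map simply applies the pure function inside the context and rebuilds the context around the result.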
Implicitly, I suppose that my container of type M will also implement the Functor trait. Joshua Suereth provides more elegant ways to separate the container implementation from the pattern trait definitions themselves. For now, we just try to progress in our little exploration.
That's nice, but our multiplication hosts two operands and we are just mapping one embedded value to another! As a tribute to Learn You a Haskell (LYAH), and naming our logging context StringWriter, imagine having at hand the following value factory:

def logNumber(x: Int) = StringWriter(x, "Got number " + x + " ")

We could execute our multiplication:

logNumber(5).map(y => x * y)

Where the hell would the x come from? Could we simply imagine producing this:

logNumber(3).map(x => logNumber(5).map(y => x * y))

Well... almost, because the type of the produced result would be StringWriter[StringWriter[Int]] and not StringWriter[Int]. The flattening operation is then provided by the Monad pattern per se (hurray!) as a flatten operation:

def flatten[T](m: M[M[T]]): M[T]
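
To see why flatten is needed, consider a minimal, self-contained sketch. MiniWriter is a hypothetical stand-in for the StringWriter built below, assuming String concatenation as the way to combine logs:

```scala
// Hypothetical minimal writer: a value plus an accumulated String log
case class MiniWriter[T](value: T, log: String) {
  def map[U](f: T => U): MiniWriter[U] = MiniWriter(f(value), log)
}

// flatten merges the outer log with the inner writer's log
def flatten[T](m: MiniWriter[MiniWriter[T]]): MiniWriter[T] =
  MiniWriter(m.value.value, m.log + m.value.log)

def logNumber(x: Int) = MiniWriter(x, "Got number " + x + " ")

// map alone nests the context...
val nested: MiniWriter[MiniWriter[Int]] = logNumber(3).map(x => logNumber(5).map(y => x * y))
// ...and flatten recovers a single-layer writer
val flat = flatten(nested)  // MiniWriter(15, "Got number 3 Got number 5 ")
```
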

Just a minute! Is this it? No: as you can imagine, Monads should obey laws, as should their cousins the Functors, but for now I just propose you use the following method

def flatMap[U](f: T => M[U]): M[U]

simplifying our previous expression:

logNumber(3).flatMap(x => logNumber(5).map(y => x * y))

That could be chained up to infinity (and beyond...):

logNumber(3).flatMap(x => logNumber(5).flatMap( y => logNumber(7).map(z => x * y * z)))

and so on, producing at each flatMap invocation a new StringWriter[Int] container instance.
The beauty, from my point of view, comes from the separation between the extraction/insertion of the value into the working context and the function application (z => x * y * z), which remains cleanly isolated. The complete definition I provided was
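A minimal, self-contained sketch (again using a hypothetical MiniWriter in place of the StringWriter defined below, and assuming String concatenation for the logs) shows that each flatMap call indeed yields a fresh writer carrying the accumulated log:

```scala
// Hypothetical minimal writer: each flatMap produces a brand new instance
// whose log is the concatenation of the two logs, in order
case class MiniWriter[T](value: T, log: String) {
  def map[U](f: T => U): MiniWriter[U] = MiniWriter(f(value), log)
  def flatMap[U](f: T => MiniWriter[U]): MiniWriter[U] = {
    val next = f(value)
    MiniWriter(next.value, log + next.log)
  }
}

def logNumber(x: Int) = MiniWriter(x, "Got number " + x + " ")

val chained = logNumber(3).flatMap(x =>
  logNumber(5).flatMap(y =>
    logNumber(7).map(z => x * y * z)))
// chained.value == 105, chained.log traces the three inputs in order
```
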

package com.promindis.patterns

trait MonadHelper[M[_]] {
  def flatten[T](m: M[M[T]]): M[T]
}

trait Monad[T, M[_]] extends Functor[T, M] {
  def flatMap[U](f: T => M[U])(implicit helper: MonadHelper[M]): M[U] = helper.flatten(map(f))
}

I voluntarily split the flatten function into a second trait, as this generic operation is bound only to the type of the container. I expect the compiler to check at compile time that implicit helper objects are available in scope. (I don't specifically like implicits, but they can become very handy if not overused.) As I expect my StringWriter to become reusable some day, I reused one of my existing traits, the Monoid, to define my writer trait as in LYAH. Monoids not being the purpose of today's blog, just keep in mind the LYAH definition, remembering that they are simple structures allowing one to isolate, in a handy way, the combination of elements of the same type behind a simple function operation:

package com.promindis.patterns

trait Monoid[T] {
  def add(x: T, y: T): T
  def unit: T
  def concat(xs: Traversable[T]): T =
    if (xs.isEmpty) unit else xs.foldLeft(unit){add(_, _)}
}

object Monoid {
  implicit object StringMonoid extends Monoid[String] {
    override def add(x: String, y: String) = x + y
    override def unit = ""
  }

//  implicit object IntegerMonoid extends Monoid[Int] {
//    override def add(x: Int, y: Int) = x + y
//    override def unit = 0
//  }

//  implicit def listMonoid[T]: Monoid[List[T]] = new Monoid[List[T]] {
//    override def add(x: List[T], y: List[T]) = x ++ y
//    override def unit = Nil
//  }

//  def concatenate[T](xs: Traversable[T])(implicit monoid: Monoid[T]) = monoid.concat(xs)
}

As you may have guessed, the StringMonoid definition offers us all the operations required to achieve String concatenation.
Then comes the Writer trait definition:
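As a quick sanity check, a trimmed-down sketch of this Monoid can be exercised directly. I substitute Seq for Traversable here only so the sketch stays runnable on recent Scala versions, where Traversable was removed:

```scala
// Self-contained sketch of the Monoid used for log concatenation
// (Seq replaces Traversable so the sketch runs on recent Scala versions)
trait Monoid[T] {
  def add(x: T, y: T): T
  def unit: T
  def concat(xs: Seq[T]): T =
    if (xs.isEmpty) unit else xs.foldLeft(unit)(add(_, _))
}

object StringMonoid extends Monoid[String] {
  override def add(x: String, y: String) = x + y
  override def unit = ""
}

// Folding from the unit element concatenates the logs in order
val joined = StringMonoid.concat(List("Got number 3 ", "Got number 5 "))
```
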

package com.promindis.Logger

import com.promindis.patterns.Monoid

trait Writer[T, M] {
  val value: T
  val log: M
  def context: (T, M) = (value, log)
  def combined(otherLog: M)(implicit monoid: Monoid[M]) = monoid.add(log, otherLog)
}

and its implementation as both a Functor and a Monad:

package com.promindis.Logger

import com.promindis.patterns.{MonadHelper, Monoid, Monad}

class StringWriter[T](val value: T, val log: String)
  extends Writer[T, String] with Monad[T, StringWriter] {

  def this(value: T)(implicit monoid: Monoid[String]) = this(value, monoid.unit)

  override def map[U](f: (T) => U) = StringWriter(f(context._1), context._2)

  override def toString = "[" + value + ", " + log + "]"
}

object StringWriter {
  def apply[T](value: T) = new StringWriter[T](value)

  def apply[T](value: T, log: String) = new StringWriter[T](value, log)

  implicit object writerMonadHelper extends MonadHelper[StringWriter] {
    def flatten[T](m: StringWriter[StringWriter[T]]) = {
      val pair = m.context
      val innerW = pair._1
      val innerPair = innerW.context
      StringWriter(innerPair._1, m.combined(innerPair._2))
    }
  }
}

  • The StringWriter class will implicitly use the String concatenating monoid.
  • The StringWriter class is itself both a Monad and a Functor
  • The specific Monad helper is defined in the scope of the companion object as an implicit internal object, writerMonadHelper, for the compiler to implicitly bind to the flatMap method of the Monad trait
This is where we can play, and where the magic of the Scala compiler plays its part.
We can of course reproduce the LYAH example using chained invocations of the flatMap/map methods, but as we implement both flatMap and map, Scala permits us to use for comprehensions; in extenso:

package com.promindis.user

import com.promindis.Logger.StringWriter

object UseWriter {

  import StringWriter._

  def logNumber(x: Int) = StringWriter(x, "Got number " + x + " ")

  def main(arguments: Array[String]) {
    val value = for {
      a <- logNumber(3)
      b <- logNumber(5)
    } yield a * b
    println(value)

    println(
      logNumber(3).flatMap(x =>
        logNumber(5).map(y =>
          x * y)))

    val anotherValue = for {
      a <- logNumber(3)
      b <- logNumber(5)
      c <- logNumber(7)
    } yield a * b * c
    println(anotherValue)

    println(
      logNumber(3).flatMap(x =>
        logNumber(5).flatMap(y =>
          logNumber(7).map(z =>
            x * y * z))))
  }
}

That shall produce

[info] Running com.promindis.user.UseWriter
[15, Got number 3 Got number 5 ]
[15, Got number 3 Got number 5 ]
[105, Got number 3 Got number 5 Got number 7 ]
[105, Got number 3 Got number 5 Got number 7 ]
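As noted earlier, Monads must obey laws: left identity, right identity, and associativity. An informal check against a minimal hypothetical writer sketch (String-concatenating logs, with the empty log playing the role of unit) suggests that our writer shape respects them:

```scala
// Hypothetical minimal writer used only to illustrate the monad laws
case class MiniWriter[T](value: T, log: String) {
  def map[U](f: T => U): MiniWriter[U] = MiniWriter(f(value), log)
  def flatMap[U](f: T => MiniWriter[U]): MiniWriter[U] = {
    val next = f(value)
    MiniWriter(next.value, log + next.log)
  }
}

// unit wraps a value with an empty log (the Monoid's unit)
def unit[T](x: T) = MiniWriter(x, "")

val f = (x: Int) => MiniWriter(x * 2, "doubled ")
val g = (x: Int) => MiniWriter(x + 1, "incremented ")
val m = MiniWriter(3, "start ")

val leftIdentity = unit(3).flatMap(f) == f(3)
val rightIdentity = m.flatMap(unit[Int]) == m
val associativity = m.flatMap(f).flatMap(g) == m.flatMap(x => f(x).flatMap(g))
```

This is only a spot check on one set of values, of course, not a proof; a property-based test over arbitrary values and functions would be the next step.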

Though not fond of the imperative look of for comprehensions, I concede that their use does clarify the code. Unlike Tony Morris, I have not provided more detailed examples of usage, but I promise to work on them as soon as I have finished writing this blog entry.

Thank you for following me on this first exploration, and do not hesitate to leave helpful comments.

Be seeing you !!! :)

