News

Posted almost 7 years ago
Since first releasing Rillet.js, my JavaScript streams library, back at the start of May, I've carried on working on it and somehow managed to produce, at time of writing, a further 10 releases of one sort or another. While some of those are trivial documentation fixes, I have added quite a number of additional terminal methods and a stream modifier too.

The motivation for part of this work has been to provide parity with JavaScript's Array class. If you can do it eagerly with an Array, you should be able to do it lazily with Rillet, hence the addition of every, some, none, and join. I've drawn inspiration (by which I mean copied) from other streams libraries, including .NET's Linq, Java 8 Streams, Python iterators, and my own previous work, which so far has resulted in the max, min, and sum terminators. Lastly, and perhaps most importantly coming as it does from work I've been doing, is the modifier method uniq, which strips duplicates from the stream.

Of course, all modifiers can be written in terms of filter or map, and all terminators can be written in terms of reduce, sometimes trivially so. Here's max

    seq.reduce((a,b) => (a > b) ? a : b, Nill);

and sum, which is even simpler

    seq.reduce((a,b) => Number(a)+Number(b), 0);

Why bother, then? The first reason, and it's a pretty important one, is that it's simply more expressive. What a call to sum does is pretty obvious, while reduce((a,b) => Number(a)+Number(b), 0), although in no way obscure, still requires you to think a bit. This is even more the case for something like

    seq.filter(function() {
        const seen = {};
        return i => {
            if (seen[i]) return false;
            seen[i] = true;
            return true;
        }
    }())....

which is a whole lot wordier and significantly less obvious than

    seq.uniq()....

and also contains a subtle bug.

Correctness, then, is another reason. Each time you have to write an extra bit of code, you risk getting it wrong. The more the library can do for you, the fewer bugs you'll write.

Finally, there are cases where a specialised implementation can be more efficient. Rillet's none, some, and every can all return early when appropriate, while the same tests expressed in terms of reduce would have to continue until the sequence was exhausted. For most cases I expect the execution time difference to be trivial, but when it does count, it could be significant.

When I first put Rillet together it wasn't really much more than a little demonstration to support my ACCU talk. Rather to my surprise, my recent work has all been in JavaScript, and so I've been using it in anger. It's working out rather well so far.
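Returning to the filter-based uniq above: a likely candidate for that subtle bug is the plain object used as the lookup. Object keys are coerced to strings (so 1 and "1" collide), and the object inherits properties like toString from Object.prototype, so the first "toString" in a stream would be wrongly dropped as a duplicate. A Set sidesteps both. The following is just a minimal sketch of a Set-based de-duplicating generator for illustration, not Rillet's actual uniq implementation:

    // Sketch: a Set-based de-duplicating generator (illustration only).
    // Set membership uses SameValueZero comparison, so 1 and "1" stay
    // distinct, and there are no inherited keys to trip over.
    function* uniq(iterable) {
        const seen = new Set();
        for (const item of iterable) {
            if (seen.has(item)) continue;
            seen.add(item);
            yield item;
        }
    }

    // [...uniq([1, "1", "toString", 1])]  -->  [ 1, '1', 'toString' ]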
Posted almost 7 years ago
In my recent talk at ACCU 2017 I spent some time talking about generators. Generators are, of all the things introduced in ES2015, probably the most important single feature. While it's always possible to do things in a programming language, often we don't because it seems like hard work or even because it simply doesn't occur to us. The ES2015 generators say, explicitly, hey kids, this is a good idea right here. It opens the door to a new type of programming in JavaScript, and will, I hope, lower the impedance mismatch that often exists between different libraries.

By a new type of programming in JavaScript I, of course, mean a type of programming that's been around approximately for ever but which is now easy and attractive in JavaScript. I'm talking about functional programming, a style of programming that avoids mutable data and changing state, widely popularised by Lisp (and then subsequently ignored for a long time because John McCarthy was just so absurdly clever it took the rest of us an age to really understand what he was talking about). More specifically, I'm talking about functional programming with streams which, in addition to sidestepping mutable state, can also be splendidly expressive and concise.

Here's an actual question I found on Stack Overflow -

    a=[{'report1':{'name':'Texas'}},{'report2':{'name':'Virginia'}}]

From the above javascript object I need to extract only 'report1' object. I can use foreach to iterate over 'a' but somehow need to extract 'report1' object.

One of the suggested answers is

    var report1 = null;
    for ( var i = 0, len = a.length; i < len; i++ ) {
        if ( 'report1' in a[i] ) {
            report1 = a[i].report1;
            break;
        }
    }

which, while correct and maybe even obvious, can't match the lovely expressiveness of

    var obj = a.find(o => o['report1']);

This answer uses the find method which ES2015 adds to the Array class. JavaScript already had map, filter and reduce methods, and together with find they provide the core functional programming tools that have delighted Lisp programmers for so long.

So problem solved, right? Why am I banging on about generators? Array.find, Array.map, Array.reduce and friends are great and all, but the clue is in the name - Array. They work with arrays, so your data needs to be in an array. There's also an issue in the implementation. Consider this example, finding the first even number in an array filled with random numbers -

    const arr = function_returning_an_array_of_random_numbers();
    const first_even_number = arr.filter(n => n%2==0)[0];

Neat and elegant, right? Well, yes and no. Think about the filter method - what does it return? As the [0] on the end there shows, it returns an array. The filter method is, therefore, operating over the whole of arr, building a new array which it returns, and which we, in this case, index into to grab the first item. That's unnecessary, redundant work. If arr isn't that long, it probably isn't an issue, but what if arr is long? That extra work might start to take noticeable time or memory. What if arr is infinitely long?

Crazy talk, right? An array can't be infinitely long, it's true, but a stream can be. And this is where generators enter the picture. JavaScript generators are really an interface to a sequence, potentially infinitely long, of data. The source of the data could be a database cursor, a web socket, a file, a calculation, almost anything.
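To make that last point concrete - this isn't Rillet code, just a minimal sketch - a generator can hand out an endless calculated sequence one value at a time, with nothing computed until the consumer asks for it:

    // An infinite sequence of Fibonacci numbers. No work happens until a
    // consumer calls next() (directly, or via for...of, spread, etc).
    function* fibonacci() {
        let [a, b] = [0, 1];
        for (;;) {
            yield a;
            [a, b] = [b, a + b];
        }
    }

    const fib = fibonacci();
    console.log(fib.next().value, fib.next().value, fib.next().value); // 0 1 1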
Going back to the random-numbers example above, suppose that instead of a function returning an array of random numbers we had a generator producing a sequence of random numbers, and we want to find the first even number in that sequence -

    function* randomInterval() {
        for(;;)
            yield Math.floor(Math.random()*101); // random number between 0 and 100
    }

    let first_even_number;
    for (const n of randomInterval())
        if (n%2 == 0) {
            first_even_number = n;
            break;
        }

Oh, that's worse. This is where (at last!) rillet.js comes in. Through the well-established technique of applying another level of indirection - in this case throwing more generators at things - we can rillet-ify (not a real word) this to

    const from = require('./rillet.js').from; // pull in module

    function* randomInterval() {
        for(;;)
            yield Math.floor(Math.random()*101); // random number between 0 and 100
    }

    const first_even_number = from(randomInterval()).filter(n => n%2==0).first();

What the Rillet library provides is a thin wrapper around a generator with some additional methods on it. Rillet methods include map, filter, and so on, and they operate similarly to the Array methods with the same name, with one important exception. Instead of returning an array, they return another Rillet instance, i.e. another generator.

In the code above, from is simply a factory method that calls the Rillet constructor. The class itself is the thinnest possible wrapper around the generator.

    class Rillet {
        constructor(iterable) { this.iterable = iterable; } // constructor

        *[Symbol.iterator]() { yield* this.iterable; }

        // more of the class
    }

A filter generator that wraps around a generator and only yields values matching the supplied predicate is straightforward

    function* filter(iterable, predicate) {
        for (const a of iterable)
            if (predicate(a))
                yield a;
    } // filter

Rillet's filter method just throws a new Rillet object around that ...

    class Rillet {
        // constructor, etc

        filter(predicate) { return new Rillet(filter(this.iterable, predicate)); }

        // more of the class
    }

Rillet's map and other methods work similarly. In this way, we can build up arbitrarily complex calculation pipelines. The effect of this is twofold - first, as I said above, we can handle arbitrarily long sequences of data, and second, no work is done until we call a method that actually exercises the pipeline. Furthermore, when we do finally exercise the pipeline, the smallest possible amount of work is done. Working with arrays is maximalist - everything is eagerly evaluated - while working with streams is minimalist - everything is lazily evaluated.

At time of writing, it's a library in progress so rather than describe every method I'm going to embed the README here so it'll always be up to date.

...

The Rillet source is available on GitHub. It's also available as an npm package

    npm install rillet
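To flesh out the "map and other methods work similarly" remark above, here's a sketch of how a map generator and a first terminal method could look in the same style. This is my own illustration under those assumptions, not Rillet's actual source:

    // Sketch only, in the same style as the filter example above.
    function* map(iterable, fn) {
        for (const a of iterable)
            yield fn(a);
    } // map

    class Rillet {
        constructor(iterable) { this.iterable = iterable; }

        *[Symbol.iterator]() { yield* this.iterable; }

        // map stays lazy: it just wraps another generator in another Rillet.
        map(fn) { return new Rillet(map(this.iterable, fn)); }

        // first() is terminal: it pulls at most one value from the pipeline
        // and stops, so only the minimum of upstream work is done.
        first() {
            for (const a of this.iterable)
                return a;
            // returns undefined on an empty sequence
        }
    }

The important property is that map returns another lazy Rillet, while first actually pulls values - and stops as soon as it has one.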
Posted almost 7 years ago
ES6 is (almost) the most recent version of the language most commonly known as Javascript. Its publication in 2015 was the first update to Javascript since 2009 and brought a number of pretty radical revisions to both language and library. This session takes a look at some of the most significant features, the impact they have on the way we write Javascript, how we can start using them today, why we should, and a look forward to Javascript's future evolution.

I presented this talk at ACCU Conference 2017.

Slides and notes (it's a reveal.js deck, so press 'S' to get the speaker notes)
Source of the Java Streams/.NET Linq style library
A longer discussion of the library
The coroutines sample source

I first gave this talk, in a rather shorter form (but with a more energetic delivery), at Nor(DEV):con in February 2016. That was also captured for posterity and is available on InfoQ
Posted about 7 years ago
A longitudinal study

Recorded this year at 15:55 on Sunday, 26 February. That's early - the fourth earliest recorded - and reflects the relatively mild winter. Analysis!

The full ice cream van data is available in this spreadsheet

Previous years:
16:19, Sunday 10 April 2016
14:16, Sunday 22 March 2015
16:25, Friday 28 March 2014
16:00, Sunday 17 February 2013
18:52, Sunday 15 April 2012
16:45, Tuesday 01 March 2011
~15:00, Thursday 04 March 2010
16:53, Saturday 07 March 2009
15:52, Saturday 23 February 2008
16:43, Saturday 3 March 2007
16:23, Tuesday 21 March 2006
13:38, Monday 14 February 2005
17:14, Friday 13 February 2004
Posted over 7 years ago
To tell the truth, I have no idea. Development of Mangle, Arabica's XSLT engine, is ongoing, although progress varies according to the vagaries of how busy I am, how energetic I'm feeling, whether the kids have a swimming gala, and so on and so forth. Although it's not done yet, it might well be done enough.

I'm using the OASIS XSLT test suite to help drive development, and so it also provides a measure of how much has been done, what's working and what isn't. The results are published here, but all the code and test data is included in the download. The executive summary is that the core stuff you use every day works, but some of the bits round the edges (edges defined by my experience, anyway) are missing. To my knowledge there's nothing that causes Mangle to crash, and anything that I haven't yet implemented generates a warning when the stylesheet is compiled.

Give it a go. It might do what you need.