Posted almost 6 years ago by dj2
There has been a lot of crazy stuff going on the last few weeks. Just a quick update on what's going on in this neck of the woods.

First, PostRank has been acquired by Google, which is pretty awesome. This has the added note that we'll be moving to California in the coming months. Lots to do, house to sell, stuff to move. Both Stacy and I are pretty excited by this set of events, plus, you know, no snow. That's definitely a bonus.

The other piece of news is that my first PeepCode screencast has been published. You can go check out Meet EventMachine: Part I if you'd like to learn more about EventMachine. The second part should be published soon. This has been a long time coming, my fault given I was dragging my heels shooting the initial video. Geoffrey from PeepCode has been absolutely awesome through this whole process and the finalized video is much cooler than I originally envisioned.

That's the nutshell. Hopefully things will calm down soon.
Posted about 6 years ago by dj2
Over at PostRank Labs we've released the current version of our API server framework to the wild. Allow me to introduce Goliath. Goliath is built on the back of the work at PostRank, Thin, Sinatra, http-parser.rb and various other projects. We've been using a version of Goliath internally for over a year, serving a sustained 500 requests/sec and shuttling around gigabytes of data.

How did we get here, you're asking? Our API servers from the beginning have been built on the awesome EventMachine library. The last several iterations have used a threaded model: a request comes in, a thread is grabbed from the pool, the request is processed, the thread goes back to the pool. This worked quite well for us for several years. The problem we ran into was spaghetti code. A lot of our APIs call other APIs internally, and handling the callbacks, errbacks and various other deferred operations was getting difficult to maintain.

Then along wandered Ruby 1.9 with these cool little things called Fibers. Fibers allow us to build synchronous-looking code backed by asynchronous calls. To achieve this we're using the EM-Synchrony library. Tying this back into Goliath: as each request comes in, a new Fiber is created, the request is processed within that Fiber and, when completed, the Fiber goes away.

Once we had the foundation in place we went a step further and built a few extra goodies into Goliath, including streaming requests and responses. This lets us do some cool things like hook AMQP exchanges up to a Goliath API to pipe content through a streaming response. Using http-parser.rb opens up the option of alternate Ruby VMs. At the time of writing, Goliath has been tested with MRI, JRuby and Rubinius. MRI is currently the fastest VM for Goliath, but there is lots of work going on in the alternate VMs to make them faster.

OK, OK, enough of the hype, let's see some code already. Well, first we need to get Goliath installed.
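As an aside before the install: the Fiber trick described above can be sketched in plain Ruby, with a simple array of callbacks standing in for the EventMachine reactor (the names here are illustrative, not Goliath's internals):

    # A sketch of the Fiber idea: the Fiber pauses while an "async"
    # operation is outstanding and resumes with the result, so the code
    # reads top-to-bottom even though the work completes via a callback.
    callbacks = []

    fetcher = Fiber.new do
      me = Fiber.current
      # "Start" an async request: register a callback, then pause.
      callbacks << lambda { me.resume("hello from the network") }
      result = Fiber.yield    # control returns to the "reactor" here
      result.upcase           # ...and the synchronous-looking code continues
    end

    fetcher.resume                 # run the Fiber until it yields
    final = callbacks.shift.call   # "reactor" fires the callback; Fiber finishes
    # final => "HELLO FROM THE NETWORK"

EM-Synchrony wraps this dance up for real EventMachine I/O so you never write the resume/yield plumbing yourself.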
In general this is pretty simple. You can use your system Ruby, RVM or some other setup, and all you need to do is:

    gem install goliath

What we're going to build here is an HTTP proxy that stores information about each request/response into MongoDB. (The full code listing can be found in this gist.)

Before we dig into the code, a little on how Goliath works. By default, Goliath will execute your API when you run the API file. In order to do this, the API file has to have a snake-cased name based on the class name. So, our HttpLog API will live in a file called http_log.rb. When you need to add configuration to your API, you do it by creating a config directory in the same directory as your file. You then create a file in the config directory with the same name as your API. In our case, we'll create a config/http_log.rb file to hold our configuration.

Goliath has been built to be Rack-aware. This means we can make use of Rack middlewares, as long as they're async-safe or have been wrapped by async-rack.

We're going to build some specs along with our API. Create spec/http_log_spec.rb and we can get this party started.

    require 'rubygems'
    require 'goliath/test_helper'
    require File.expand_path(File.join(File.dirname(__FILE__), '..', 'http_log'))

    describe HttpLog do
      include Goliath::TestHelper

      let(:err) { Proc.new { |c| fail "HTTP Request failed #{c.response}" } }

      it 'responds to requests' do
        with_api(HttpLog) do
          get_request({}, err) do |c|
            c.response_header.status.should == 200
          end
        end
      end
    end

With Goliath we've provided a simple goliath/test_helper file that makes the testing of APIs a little easier. Once you've required the library, you just need to include Goliath::TestHelper and you can start using the helper methods. These include the with_api and get_request methods that are being used above.
The with_api call will start an instance of our API running on port 9000, and anything done within its block will execute inside an EM reactor. This method will not return until stop is executed. We're, essentially, creating an integration test suite for our API. Everything except the API launching will be exercised by the suite: the configuration files will be parsed, and all the middlewares will be loaded and executed. Everything your normal application would do except the launching.

Calling get_request will send a GET to our API. The first parameter is any query parameters to send to the API; in our case, we send nothing. The second is an error handler to add. This is fired if we are unable to communicate with the API. For this example, we're just using the RSpec fail method to fail the spec in the error handler. The block passed to get_request will be executed and provided the request object when the request is complete. This allows us to look at the response headers and response body. In this case, we're just verifying that the server responds with a 200 code. When you're using get_request or post_request, a default error and callback handler will be installed that calls stop for you.

With that, running the spec should fail. Let's create a simple API to make it pass. The following code in http_log.rb will give us a passing spec.

    #!/usr/bin/env ruby
    require 'rubygems'
    require 'goliath'

    class HttpLog < Goliath::API
      def response(env)
        [200, {}, "Hello"]
      end
    end

You can execute the API with ./http_log.rb -sv, which will launch the API on a default port of 9000. The -sv flags tell the API to log to STDOUT (s) and use verbose logging (v). You can run ./http_log.rb -h to see a list of all the default options provided by Goliath. You can add your own options as well if you need special argument handling for your API. With the API running we can query the API using curl.
    dj2@titania ~ $ curl localhost:9000
    Hello

Running the spec again, our test case should also pass at this point.

To make our lives easier, we're going to use the Rack::Reloader middleware to reload our code on each request. This is done by adding use ::Rack::Reloader, 0 to our API. Since we only want reloading when we're doing development, we're going to make the use statement conditional on the Goliath environment. By default Goliath executes in the development environment. We can confirm this by calling Goliath.dev?, which will return true if the current environment is development.

    class HttpLog < Goliath::API
      use ::Rack::Reloader, 0 if Goliath.dev?

      def response(env)
        [200, {}, "Hello"]
      end
    end

You'll notice that we didn't create another file to hold the middleware. With Goliath, the middleware is built into the API class itself. We've found this is a lot easier to deal with when you keep everything together in the same file. (You can also add plugins in the same fashion by using the plugin keyword.)

The next step is to start forwarding to our backend server. To do this, we'll specify the forwarder URL in our configuration file. Create config/http_log.rb and add the following:

    config['forwarder'] = 'http://localhost:8080'

The config hash allows us to store configuration data that will become available through the env parameter to the response method. In this case, we've specified that we want our forwarder to send requests to http://localhost:8080. In order to test this, I'm going to create a Goliath API server inside our spec file. I'll run the server on port 8080 and check for our server's response in the response from our API. Add the following to the top of the spec file.
    class Responder < Goliath::API
      def response(env)
        [200, {"Special" => "Header"}, "Hello from Responder"]
      end
    end

And the following spec:

    it 'forwards to our API server' do
      with_api(HttpLog) do
        server(Responder, 8080)
        get_request({}, err) do |c|
          c.response_header.status.should == 200
          c.response_header['Special'].should == 'Header'
          c.response.should == 'Hello from Responder'
        end
      end
    end

This is similar to our previous spec except I've added a server(Responder, 8080) call to launch an instance of our Responder API on port 8080. Running this test should fail since we aren't sending or receiving any data from the proxied API. As a first pass, we'll send the request and proxy the responses back.

    def response(env)
      req = EM::HttpRequest.new("#{env.forwarder}").get
      [req.response_header.status, req.response_header, req.response]
    end

So, that's cool. (Make sure you add server(Responder, 8080) to the first example after doing this or it will end up failing.) There is one non-apparent problem: it turns out the headers get transformed by EM-HTTP-Request. When we go to log this stuff we'll want the non-transformed headers, so we need to transform them back into the normal HTTP header format. Let's start with a couple of tests:

    context 'HTTP header handling' do
      it 'transforms back properly' do
        hl = HttpLog.new
        hl.to_http_header("SPECIAL").should == 'Special'
        hl.to_http_header("CONTENT_TYPE").should == 'Content-Type'
      end
    end

Simple enough: we need to fix the casing and change _ into -. Let's add that to our HttpLog class.
    def to_http_header(k)
      k.downcase.split('_').collect { |e| e.capitalize }.join('-')
    end

That should make our transform tests pass, and we can build the change into the response method by changing it to look like:

    def response(env)
      req = EM::HttpRequest.new("#{env.forwarder}").get

      response_headers = {}
      req.response_header.each_pair do |k, v|
        response_headers[to_http_header(k)] = v
      end

      [req.response_header.status, response_headers, req.response]
    end

Cool, so now we're forwarding all requests to the proxied server. There is still a bit of missing information that we need: the headers, query parameters, request path, and request methods other than GET. Let's start with the query parameters. To test this I'm going to augment our Responder class to return the request parameters as part of the headers.

    class Responder < Goliath::API
      use Goliath::Rack::Params

      def response(env)
        query_params = env.params.collect { |param| param.join(": ") }
        headers = {"Special" => "Header",
                   "Params" => query_params.join("|")}
        [200, headers, "Hello from Responder"]
      end
    end

We're using another middleware here, Goliath::Rack::Params, which will parse the query and body parameters and put them into the params hash of the environment. Using that, we can send the parameters back as a header in the response. The test for query parameters is pretty simple: pass them in, make sure they come back.

    context 'query parameters' do
      it 'forwards the query parameters' do
        with_api(HttpLog) do
          server(Responder, 8080)
          get_request({:query => {:first => :foo, :second => :bar, :third => :baz}}, err) do |c|
            c.response_header.status.should == 200
            c.response_header["PARAMS"].should == "first: foo|second: bar|third: baz"
          end
        end
      end
    end

To make this pass, we just need to use Goliath::Rack::Params in HttpLog and change the EM::HttpRequest#get request to provide the params.
This is done by using:

    req = EM::HttpRequest.new("#{env.forwarder}").get({:query => env.params})

Next up, let's handle the request path. We'll start with the spec.

    context 'request path' do
      it 'forwards the request path' do
        with_api(HttpLog) do
          server(Responder, 8080)
          get_request({:path => '/my/request/path'}, err) do |c|
            c.response_header.status.should == 200
            c.response_header['PATH'].should == '/my/request/path'
          end
        end
      end
    end

Our Responder needs to return the path in the headers. This is done by adding "Path" => env[Goliath::Request::REQUEST_PATH] to the headers hash it returns. Making this spec pass is as simple as using the same env[Goliath::Request::REQUEST_PATH] data in our forwarder request.

    req = EM::HttpRequest.new("#{env.forwarder}#{env[Goliath::Request::REQUEST_PATH]}").get(params)

With the request path out of the way, let's look at the headers. Goliath has built-in facilities for streaming requests. As part of that, we can hook into the on_headers method to receive a callback with the headers when they're available. We can then store them in the environment and return them in a similar fashion to the query parameters. We're going to use the same on_headers callback in the Responder class to give us access to the headers. Then, the same as with query params, we'll return them in a header of the response.
    class Responder < Goliath::API
      use Goliath::Rack::Params

      def on_headers(env, headers)
        env['client-headers'] = headers
      end

      def response(env)
        query_params = env.params.collect { |param| param.join(": ") }
        query_headers = env['client-headers'].collect { |param| param.join(": ") }

        headers = {"Special" => "Header",
                   "Params" => query_params.join("|"),
                   "Path" => env[Goliath::Request::REQUEST_PATH],
                   "Headers" => query_headers.join("|")}
        [200, headers, "Hello from Responder"]
      end
    end

With a spec similar to the query example as well:

    context 'headers' do
      it 'forwards the headers' do
        with_api(HttpLog) do
          server(Responder, 8080)
          get_request({:head => {:first => :foo, :second => :bar}}, err) do |c|
            c.response_header.status.should == 200
            c.response_header["HEADERS"].should =~ /First: foo\|Second: bar/
          end
        end
      end
    end

Let's make the tests pass:

    def on_headers(env, headers)
      env.logger.info 'proxying new request: ' + headers.inspect
      env['client-headers'] = headers
    end

You'll notice I added a call to env.logger.info. Goliath comes with built-in logging capabilities. The logger is based on Log4r and is accessed through the environment. The Goliath environment is a subclass of Hash and can be used as such to store and retrieve information. Once we've got the headers stored, adding them to our request is a simple process.

    params = {:head => env['client-headers'], :query => env.params}
    req = EM::HttpRequest.new("#{env.forwarder}#{env[Goliath::Request::REQUEST_PATH]}").get(params)

One step left and our proxy is complete: sending the right request method through to the proxied server (we're just going to do GET and POST in this example). First step, add "Method" => env[Goliath::Request::REQUEST_METHOD] to the headers returned from our Responder API.
We can then add the specs:

    context 'request method' do
      it 'forwards GET requests' do
        with_api(HttpLog) do
          server(Responder, 8080)
          get_request({}, err) do |c|
            c.response_header.status.should == 200
            c.response_header["METHOD"].should == "GET"
          end
        end
      end

      it 'forwards POST requests' do
        with_api(HttpLog) do
          server(Responder, 8080)
          post_request({}, err) do |c|
            c.response_header.status.should == 200
            c.response_header["METHOD"].should == "POST"
          end
        end
      end
    end

To make this pass we'll use Goliath::Request::REQUEST_METHOD to determine the right type of request to make. You'll notice I'm putting the response into a resp variable, so the references to req in the rest of the method need to be updated as well.

    req = EM::HttpRequest.new("#{env.forwarder}#{env[Goliath::Request::REQUEST_PATH]}")
    resp = case(env[Goliath::Request::REQUEST_METHOD])
      when 'GET'  then req.get(params)
      when 'POST' then req.post(params.merge(:body => env[Goliath::Request::RACK_INPUT].read))
      else p "UNKNOWN METHOD #{env[Goliath::Request::REQUEST_METHOD]}"
    end

The only significant change from the previous version: for the POST request, we need to send the body data through to the forwarded server. The body is stored in a StringIO object, which we can read. That object is stored in the environment under the Goliath::Request::RACK_INPUT key.

With that done, we should now have a functioning HTTP proxy. Let's take a look at how we can hook this up to MongoDB to give us the Log part of the API name. In order to talk to MongoDB we're going to use the em-mongo gem. Since we're working in an asynchronous environment, we may have several requests being processed at the same time, so we're going to wrap our MongoDB connection in a connection pool. There is connection pool logic built into em-synchrony for us to utilize.
I'm also, for the sake of this tutorial, going to say we don't want to create a connection to a real Mongo instance when we're running our specs. So, I'm going to restrict the connection creation to development mode. Make sense? OK, add the following to your config/http_log.rb file.

    environment(:development) do
      config['mongo'] = EventMachine::Synchrony::ConnectionPool.new(size: 20) do
        conn = EM::Mongo::Connection.new('localhost', 27017, 1, {:reconnect_in => 1})
        conn.db('http_log').collection('aggregators')
      end
    end

The environment(:development) call causes this block to execute only if we're in development mode. We could also have passed :test or :production, depending on which mode we want. Along with a single option, you can also provide an array; so, we could have said environment([:test, :development]) to execute in both test and development mode but not in production.

I'm going to leave it as an exercise for you, my reader, to figure out the synchrony and mongo code we're using in the connection. Let's just leave it at: we now have access to a MongoDB collection object in config['mongo'] which we can use in our application. Just before we return the status, headers and body, we're going to record the request into Mongo.
We'll do that by adding:

    record(resp, env['client-headers'], response_headers)

The implementation of record uses the various environment values we've seen above to do its work:

    def record(resp, client_headers, response_headers)
      e = env
      EM.next_tick do
        doc = {
          request: {
            http_method: e[Goliath::Request::REQUEST_METHOD],
            path: e[Goliath::Request::REQUEST_PATH],
            headers: client_headers,
            params: e.params
          },
          response: {
            status: resp.response_header.status,
            length: resp.response.length,
            headers: response_headers,
            body: resp.response
          },
          date: Time.now.to_i
        }

        if e[Goliath::Request::RACK_INPUT]
          doc[:request][:body] = e[Goliath::Request::RACK_INPUT].read
        end

        e.mongo.insert(doc)
      end
    end

There are a couple of things to note. First, I'm doing this work in an EM.next_tick block so that I don't block the response from returning to the client. Because I'm doing this in the next tick I'm going to lose access to env, so I tuck it away in an e variable which gets bound up with the next_tick block. Finally, the last thing we do is call e.mongo.insert(doc), which accesses the config['mongo'] object we created earlier and calls its insert method.

With that code added, all of our tests fail. We're going to need to create a mongo key in the environment that the tests can use. To do that, I've created a mock_mongo method:

    def mock_mongo
      @api_server.config['mongo'] = mock('mongo').as_null_object
    end

which I call inside the with_api(HttpLog) block. (The default API server created inside with_api is made available through the @api_server variable.) I'll leave it as an exercise to the reader to create some specs around the record method. With that, you should have a working proxy logger API.
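Before we try it out end-to-end, the header-name transform the proxy relies on is plain Ruby and easy to sanity-check on its own, outside Goliath:

    # The to_http_header transform from the proxy, runnable standalone:
    # EM-HTTP-Request style keys ("CONTENT_TYPE") back to wire format.
    def to_http_header(k)
      k.downcase.split('_').collect { |e| e.capitalize }.join('-')
    end

    # Rebuilding a wire-format header hash, as the response method does:
    raw = { 'CONTENT_TYPE' => 'text/html', 'SPECIAL' => 'Header' }
    headers = {}
    raw.each_pair { |k, v| headers[to_http_header(k)] = v }
    # headers => {"Content-Type"=>"text/html", "Special"=>"Header"}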
If you run a webserver on port 8080 and execute our HTTP logger, you can then make requests to port 9000 and see the results logged into Mongo. Looking in Mongo you should see something similar to:

    connecting to: test
    > use http_log
    switched to db http_log
    > db.aggregators.find();
    { "_id" : ObjectId("4d6efbbad3547d28f3000001"),
      "request" : { "http_method" : "GET", "path" : "/",
        "headers" : { "User-Agent" : "curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3",
                      "Host" : "localhost:9000", "Accept" : "*/*", "Version" : "1.1" },
        "params" : { "echo" : "test" }, "body" : "" },
      "response" : { "status" : 200, "length" : 19,
        "headers" : { "Content-Type" : "application/json; charset=utf-8",
                      "Server" : "PostRank Goliath API Server", "Vary" : "Accept",
                      "Content-Length" : "19", "Date" : "Thu, 03 Mar 2011 02:23:54 GMT" },
        "body" : "{\"response\":\"test\"}" },
      "process_time" : 0.007593870162963867, "date" : 1299119034 }

With that, have fun playing with Goliath. We're hoping people find some interesting uses for the framework.
Posted almost 7 years ago by dj2
This post started with a point. It didn't end with one. So, I'm going to start it again and just get to the point: what makes a good Ruby Cocoa development environment? I'll get into my rambling thoughts below, but quickly: straight-up XCode/Interface Builder, HotCocoa, Interface Builder plus command-line building, some combination of all three, or something else? If you were going to do it, how, or what would you do?

I've been playing with the above three options as I've been trying to figure this out myself. I've done a few straight HotCocoa applications (SilverLining, Rife and Postie) and, while they worked out well, I found the need to do all the layout by hand tedious and boring. I really like Interface Builder for ease of UI creation. Taking that, I tried the XCode/Interface Builder solution and went to the bundled MacRuby XCode templates. When programming Objective-C I really enjoy XCode; it makes things easy when doing purely native Mac applications. That said, XCode irks me when doing Ruby development. Maybe it's muscle memory, I'm not sure, but I like using TextMate to do my Ruby work.

The approach I've been using in my latest fiddlings (Touchstone) started with the XCode templates, but I broke out, stole some of the rake tasks from HotCocoa, and switched to TextMate and the command line. The biggest issue I've run into is that the Interface Builder integration kinda sucks. It does work, but you have to reload the class files manually all the time. A bit frustrating. On the flip side, I can organize my code however I want. The tendency for Objective-C programs to have all their code in the root directory drives me up the wall. Sure, XCode hides this, and you can change it, but, by default, everything ends up in the root folder. Maddening. With Touchstone, I've put all the application code, similar to Rails, under App/Controllers, App/Models, App/Helpers and App/Views. The same structure is copied into the .app file when it's generated.
Everything feels much cleaner. The .xib and .xcdatamodel files get compiled and moved as needed. I know I could have used the XCode build tools, but having to launch XCode and getting the debugging output in an XCode window didn't feel right to me. I wanted something a bit more contained.

Coming back to the original question, I think what I want to see is something similar to Touchstone but with the convenience of HotCocoa. The HotCocoa mappings make a lot of things feel more Rubyish, more natural. Those bindings, combined with Interface Builder, would, I think, make a kick-ass Cocoa programming environment. Again, how would you do it?
Posted almost 7 years ago by dj2
Over at PostRank we serve a lot of data through our API servers. All of those servers parse query strings. Along with, obviously, the query string itself, we also encode the POST data in query string format. This means we're doing a lot of parsing of query strings. Millions of query strings a day. Parsing time for those query strings starts to add up.

We started with the query string parser from Camping and that served us well for over a year. Eventually, parsing the query data was over 80% of the processing time and something needed to be done. From there we briefly dabbled with the Rack::Utils query string parser. While faster, this still wasn't fast enough. From Rack, in a fit of desperation, we wrote our own. With the new version, written in C using Inline Ruby, we were finally fast enough.

The parser has served us well but was starting to show some age with the Inline Ruby. We've started using Ruby 1.9 for our new API servers while the old servers were on 1.8. RVM doesn't deal with Ruby Inline very well, and we'd end up having to erase the Inline cache when switching between server versions. To that end, we converted the Inline Ruby version into a Ruby extension and, to help deal with Bundler, decided to push it out the door as a gem. You can check it out on Github or install the gem from gemcutter.

At this point, the gem does what we need it to. It doesn't handle every case, but you should be able to update it, if needed, to handle what you need.

    require 'query_string_parser'

    p QueryStringParser.qs_parse("my=query&string=parser")
    # => {"my"=>"query", "string"=>"parser"}

    include QueryStringParser
    qs_parse("does[]=array&does[]=data&does[3]=values&does=more")
    # => {"does"=>["array", "data", "values", "more"]}

As you can see from the examples above, individual params will be set to their values. If a parameter exists multiple times, it will be turned into an array of values. You can put a number inside the brackets, but it will be ignored by the parser.
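To make those semantics concrete, here's a minimal pure-Ruby sketch of the behavior described above. This is an illustration only, not the gem's C implementation:

    require 'cgi'

    # Sketch of the described semantics: repeated keys collapse into an
    # array of values, and any index inside brackets is ignored.
    def sketch_qs_parse(qs)
      out = {}
      qs.split('&').each do |pair|
        k, v = pair.split('=', 2).map { |s| CGI.unescape(s.to_s) }
        k = k.sub(/\[\d*\]\z/, '')  # "does[]" and "does[3]" both become "does"
        if out.key?(k)
          out[k] = [out[k]] unless out[k].is_a?(Array)
          out[k] << v
        else
          out[k] = v
        end
      end
      out
    end

    sketch_qs_parse("my=query&string=parser")
    # => {"my"=>"query", "string"=>"parser"}
    sketch_qs_parse("does[]=array&does[]=data&does[3]=values&does=more")
    # => {"does"=>["array", "data", "values", "more"]}

The C extension exists because exactly this kind of per-request string churn was eating our CPU; the sketch shows the contract, not the speed.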
If you're parsing a lot of query strings, give the parser a try. Let us know what you think.
Posted about 7 years ago by dj2
Picked up Final Fantasy XIII last week. I'm wishing now I had skipped it. I got sucked in by memories of older Final Fantasy games; I want Final Fantasy games to be good. Childhood memories and all that. Turns out, Final Fantasy XIII wants to turn that shit into lies.

I have to say, about the only thing the game had going for it was that it was pretty. The problem with that: I'm pretty sure the developers didn't want people to play the game, they wanted to make a movie. Half the time it feels like you're just killing time between cut scenes. You'll have a section where one cut scene will end, and you'll move your character up a set of stairs to another cut scene. I get it, you can make pretty hair and pretty water. Yes, the clothing moves very nicely. Guess what: I want to play a game. I don't want to watch a movie right now.

The gameplay is rigid as all hell. You can only go where the game wants. Your path is drawn out on the mini map and you have no say in where you're going. You can always tell where a hidden orb is; it's in the small branch that appears on the map and then dead-ends. It's laughable when a character says "We can just follow the lights". Yeah, no shit, I can't go anywhere other than following the lights.

The story is convoluted and didn't feel very solid to me (at least to the point in the game where I got). The conflicts between the characters were more grating than interesting. The magnitude of backstory is almost staggering but, hey, a great excuse to put in more cut scenes to fill in that backstory.

One of the things I really liked about the old Final Fantasy games was the turn-based combat. You could plan out your battles, pick the moves you want, refresh your memory on spells. It gave you a lot more control. The new time-based combat quickly turns into hitting the auto-generated sequence instead of putting thought into what's going on.

But the straw, the straw that broke the camel's back and then kicked him in the face: the timed battles.
You have a maximum amount of time to earn the respect of a boss. I played one of the timed battles about twenty times. I read the walkthroughs on the internet. I played a few more times. Congrats Square Enix, you created a point in the game that I can't fucking get past. It's a dead end. The game is now more frustration than fun. At that point, it's going back to the store to get traded in. I'm going to go play Bioshock 2.

To sum it up, if you see Final Fantasy XIII on the shelves, skip it. It isn't worth the money. It isn't worth the time. Sorry Square Enix, you've destroyed a once-loved franchise. Thanks for that.
Posted over 7 years ago by dj2
I've been working on Rife, a Google Reader client, over the last few days and have been digging my way through some more HotCocoa mappings. I figured the best way to remember some of this stuff is to write it down, so the following will look at synchronous and asynchronous downloads and writing an XML parser in HotCocoa/MacRuby.

So, what are we creating, you ask? Well, as I said, I've been playing with the Google Reader APIs, and we're going to do a synchronous request to Google for an identifier token. Once we've successfully authenticated, we're going to make an asynchronous request for our unread items. We'll then parse the resulting XML document and spit the titles out to the console.

As with any HotCocoa application, the easiest way to get started is to have the system set up the shell of our application. We'll use the hotcocoa command to create our application, which I'm calling titles.

    titania:example dj2$ hotcocoa titles

In order to authenticate to Google we're going to need your username and password. Since I'm sending the output to the console for demonstration purposes, I'll use the main application window to show the fields for username and password and a save button.
    require 'rubygems'
    require 'hotcocoa'

    class Application
      include HotCocoa

      def start
        application(:name => "Titles") do |app|
          app.delegate = self
          window(:frame => [100, 100, 200, 200], :title => "Titles") do |win|
            win.center
            win.will_close { exit }

            win << label(:text => "Username", :layout => {:start => false})
            win << @username_field = text_field(:layout => {:start => false, :expand => [:width]})

            win << label(:text => "Password", :layout => {:start => false})
            win << @password_field = secure_text_field(:layout => {:start => false, :expand => [:width]})

            win << save = button(:title => "save", :layout => {:start => true}) do |button|
              button.on_action { authenticate }
            end

            @username_field.setNextKeyView(@password_field)
            @password_field.setNextKeyView(save)
            save.setNextKeyView(@username_field)
          end
        end
      end

      def authenticate
        puts "DO AUTH #{@username_field.to_s} #{@password_field.to_s}"
      end
    end

    Application.new.start

If you run the application by executing macrake in the titles directory, you should see the main application window. Typing something into the username and password fields and pressing save, you should see something similar to the following in your terminal.

    DO AUTH test test

With our framework set up, let's get to the interesting stuff. First up, authenticating so we can retrieve our identifier from Google. We're going to make a synchronous request to retrieve the identifier and, if successful, call a method to start retrieving our reading list.

    def authenticate
      username = @username_field.stringValue
      password = @password_field.stringValue

      query = "https://www.google.com/accounts/ClientLogin?" +
              "Email=#{CGI.escape(username)}&Passwd=#{CGI.escape(password.to_s)}" +
              "&source=HotCocoaExample&service=reader"

      url = NSURL.URLWithString(query)
      request = NSMutableURLRequest.requestWithURL(url)
      request.addValue("HotCocoaExample", forHTTPHeaderField:"source")
      request.addValue("2", forHTTPHeaderField:"GData-Version")

      response = Pointer.new("@")
      data = NSURLConnection.sendSynchronousRequest(request,
                                                    returningResponse:response,
                                                    error:nil)
      data = NSString.alloc.initWithData(data, encoding:NSUTF8StringEncoding)

      if data =~ /^SID=(.*)\n/
        @sid = $1
        retrieve_reading_list
      else
        raise Exception.new("Authentication failed with: #{data}")
      end
    end

    def retrieve_reading_list
      puts @sid
    end

If you add the above to your application, and add require 'cgi', you should be able to run the program, put in your username and password, and get a long line of characters spit out on the terminal. Those characters are your Google SID.

Let's look a bit closer at what we're doing in the authenticate method. We start by grabbing the stringValue of the username and password fields. Then, using these values, we build the query string needed for authentication. This query string is used to build a URL object by calling NSURL.URLWithString(query). With the URL in hand we can start building our request object. This is done by calling NSMutableURLRequest.requestWithURL(url). I'm using the mutable version of the request as I want to add a few extra header values. These are both added with addValue(value, forHTTPHeaderField:field).

When we execute our request, the system is going to want to put our response object somewhere. The Cocoa version of the method accepts an NSURLResponse **response parameter. In order to handle the response we need to create a Pointer object, which is a MacRuby object for handling these pointers to objects. We want our pointer to point to an object, so we use Pointer.new("@").
With the response set up we call NSURLConnection.sendSynchronousRequest and provide our request and response objects. I don't care about the error, but if you do, you'd want to pass in something similar to our response pointer. The request will return an NSData object which we convert to a string using the initWithData initialization method of NSString. With the string in hand we try to extract our SID and, if successful, execute the retrieve_reading_list method, which just spits out the SID. OK, cool, we've now got our authentication token and are ready to move on to the asynchronous request to get our reading list.

```ruby
def retrieve_reading_list
  query = "https://www.google.com/reader/atom/user/-/state/com.google/reading-list?" +
          "xt=user/-/state/com.google/read&ck=#{Time.now.to_i * 1000}&n=2"

  url = NSURL.URLWithString(query)
  request = NSMutableURLRequest.requestWithURL(url)
  request.addValue("HotCocoaExample", forHTTPHeaderField:"source")
  request.addValue("2", forHTTPHeaderField:"GData-Version")
  request.addValue("SID=#{@sid}", forHTTPHeaderField:"Cookie")

  NSURLConnection.connectionWithRequest(request, delegate:self)
end

def connectionDidFinishLoading(conn)
  puts NSString.alloc.initWithData(@receivedData, encoding:NSUTF8StringEncoding)
end

def connection(conn, didReceiveResponse:response)
  if response.statusCode != 200
    puts "BAD STATUS: #{response.statusCode}"
    p response.allHeaderFields
  end
end

def connection(conn, didReceiveData:data)
  @receivedData ||= NSMutableData.new
  @receivedData.appendData(data)
end
```

Similar to the synchronous method, we start by building our query string, NSURL and NSMutableURLRequest. We've added a cookie to our request object to hold the SID retrieved earlier from Google. We fire the request by calling NSURLConnection.connectionWithRequest(request, delegate:self). We specify ourselves as the delegate for the connection.
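The accumulate-then-finish shape of these delegate methods translates directly to plain Ruby, which may make the flow easier to see. A minimal sketch assuming nothing beyond the standard library (the class and method names are mine, not Cocoa's):

```ruby
# A stand-in for the NSURLConnection delegate flow: chunks are appended
# as they arrive and the full body is only handed over at the end.
class ResponseCollector
  def initialize
    @received_data = nil
  end

  # analogue of connection:didReceiveData:
  def did_receive_data(chunk)
    @received_data ||= ''.dup
    @received_data << chunk
  end

  # analogue of connectionDidFinishLoading:
  def did_finish_loading
    @received_data
  end
end

collector = ResponseCollector.new
collector.did_receive_data("<feed>")
collector.did_receive_data("</feed>")
collector.did_finish_loading # => "<feed></feed>"
```

The lazy `||=` mirrors the Objective-C version: no buffer is allocated until the first chunk actually arrives.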
There are a few delegate methods we can implement to receive data and get notified of request states. These are:

- connectionDidFinishLoading(connection)
- connection(connection, didReceiveResponse:response)
- connection(connection, didReceiveData:data)

We'll look at our implementation of each of these callbacks in turn. First, in connectionDidFinishLoading(conn) we're just printing out the data retrieved. We need to convert the data, similar to what we did in the synchronous request, from an NSData object to an NSString object. In connection(conn, didReceiveResponse:response) we're just checking to see if we got a 200 response code from the server. In all other cases we print an error. The main work is done in connection(conn, didReceiveData:data) where we create an NSMutableData object if needed and append any data received into the mutable data object. Running the code at this point should dump the first two items in your reading list to the console. The data will be a big mess of XML but we'll look at parsing that in the next step.

```ruby
def connectionDidFinishLoading(conn)
  xml = HotCocoa.xml_parser(:data => @receivedData)
  @receivedData = nil

  xml.on_start_document { puts "Starting Parse" }
  xml.on_end_document do
    HotCocoa.notification(:post => true, :name => "all_entries_loaded",
                          :object => nil, :info => nil)
  end
  xml.on_parse_error { |err| puts "Parse error #{err.inspect}" }
  xml.on_cdata { |cdata| @elem_text += cdata.to_s }
  xml.on_characters { |chars| @elem_text += chars.to_s }
  xml.on_start_element do |element, namespace, qualified_name, attributes|
    @elem_text = ''
  end
  xml.on_end_element do |element, namespace, qualified_name|
    puts @elem_text if element == 'title'
  end

  xml.parse
end
```

We're finally getting into some HotCocoa-specific code with our XML parser. HotCocoa defines a mapping wrapper around NSXMLParser and provides a set of delegate methods.
These delegates mean we don't have to set our class as the delegate and create a bunch of methods; instead we can attach our code as blocks on our XML object. All the better if you want to define a few parsers in one class. We start off by creating a HotCocoa.xml_parser. The parser accepts NSData objects so we don't need to convert our response data to a string. We then set up seven callbacks. There are actually a bunch more callbacks that can be hooked up and you should look at the xml_parser mapping code to see if you need any of them. For our purposes, we only really care about these seven. The on_start_document, on_end_document and on_parse_error callbacks, as you can probably guess, get called when we start parsing, when we finish parsing and when we receive a parse error, respectively. We don't really care about start in this example, but I put it in anyway. When we've completed parsing we send a notification; other application code can then listen for this notification and do anything it needs. If we wanted, we could store the entries as they're parsed and provide them to the :object key. This would make those entries available to anyone that receives the notification. If we receive either CDATA, with on_cdata, or text, with on_characters, we append the content to our current element's text. When we receive the open tag of a new element, on_start_element, we dump our current element text as we've started a new element. We can also take a look at the element's name, attributes, namespace and qualified name, if desired. Finally, in on_end_element we print out the current element text if the element we're finishing has a name of title. With all the callbacks configured we use xml.parse to start the parser. You should, if you run this example, see the titles and authors of the first two posts in your reading list. (The author name is also in an element called title and I'm not bothering to check that the parent element is entry before spitting it out.) That's it.
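If you want to experiment with this callback style without MacRuby, Ruby's bundled REXML library offers a very similar stream-parsing interface. A rough analogue of the title extraction above (REXML, not the HotCocoa wrapper):

```ruby
require 'rexml/parsers/streamparser'
require 'rexml/streamlistener'

# Collect the text of every <title> element, SAX-style.
class TitleListener
  include REXML::StreamListener
  attr_reader :titles

  def initialize
    @titles = []
    @elem_text = ''
  end

  def tag_start(name, attrs)  # analogue of on_start_element
    @elem_text = ''
  end

  def text(data)              # analogue of on_characters
    @elem_text += data
  end

  def tag_end(name)           # analogue of on_end_element
    @titles << @elem_text if name == 'title'
  end
end

xml = "<feed><entry><title>First post</title></entry>" \
      "<entry><title>Second post</title></entry></feed>"
listener = TitleListener.new
REXML::Parsers::StreamParser.new(xml, listener).parse
listener.titles # => ["First post", "Second post"]
```

The structure is the same: reset the text buffer on open tags, accumulate character data, and act on the buffer when the element closes.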
You can now make synchronous and asynchronous requests for content and parse any resulting XML. One last thing before you go. Both of the requests we did above were GET requests. You can do other types of requests using the same methods as above; you just need a slightly different setup for the request. You can see a POST request below.

```ruby
request = NSMutableURLRequest.requestWithURL(url)
request.addValue(SOURCE, forHTTPHeaderField:"source")
request.addValue("2", forHTTPHeaderField:"GData-Version")
request.addValue("SID=#{@sid}", forHTTPHeaderField:"Cookie")

body = "first=1&second=2&third=3"
request.setHTTPMethod('POST')
request.setValue('application/x-www-form-urlencoded', forHTTPHeaderField:'Content-Type')
request.setValue(body.length.to_s, forHTTPHeaderField:'Content-Length')
request.setHTTPBody(body.dataUsingEncoding(NSASCIIStringEncoding))
```

The first few lines should look familiar from creating our asynchronous request above. Since we're going to be posting the data we use setHTTPMethod('POST') to set the request method. We've form-encoded the data so we set the appropriate Content-Type and Content-Length. Note, we convert the length to a string before passing it to setValue. Finally, we set the body of the post with setHTTPBody. You need to convert the body string into an NSData object, which we do with the dataUsingEncoding method. If you don't convert the body to NSData you'll end up sending a nil body with your post request.
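As an aside, the same form-encoded POST can be assembled in plain Ruby with Net::HTTP, which fills in Content-Type and Content-Length for you. A sketch with placeholder URL, SID and form fields:

```ruby
require 'net/http'
require 'uri'

uri = URI('https://example.com/endpoint')  # placeholder endpoint
request = Net::HTTP::Post.new(uri)
request['GData-Version'] = '2'
request['Cookie'] = 'SID=mysid'            # placeholder SID

# set_form_data URL-encodes the body and sets the Content-Type header.
request.set_form_data('first' => '1', 'second' => '2', 'third' => '3')

request['Content-Type']  # => "application/x-www-form-urlencoded"
request.body             # => "first=1&second=2&third=3"

# To actually send it (needs a real endpoint):
# Net::HTTP.start(uri.host, uri.port, :use_ssl => true) { |http| http.request(request) }
```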
Posted over 7 years ago by dj2
I'm a gamer. My friends and I have spent a lot of time playing various tabletop RPG games such as Dungeons and Dragons and Mage. The amount we play has tapered off over the last few years as the group spread out, got married, had kids, became adults. We do still game, and some of those games have been running for years; other games have fallen by the wayside for various reasons. Recently I've found myself thinking about gaming. I've never been good at the histrionics part of the game. I can find interesting ways to combine rules, feats and spells, but describing the outcome is not a strong point. I've always felt awkward describing scenes and actions. I think this is the basis of my not running games, much to the disappointment of my friends. In the last 15 (I think) years I've been gaming, I've run twice: once for a group and once as a solo. The group game was too long ago to remember how it went. I was told the solo was good, but it didn't feel solid to me. Maybe I didn't have a good enough grasp on where I wanted it to go, maybe it was something else. The lack of confidence in my descriptions is part of the reason I hang in the background in games. I'll let the other players take the lead on quests and satisfy their personal agendas. I'll tag along and do my bit, and sometimes come up with ideas that cause the GM to think, but I don't typically look to take the lead in games. This can, obviously, have a detrimental effect on character development, especially after a character, played for 2-3 years, dies and you start anew. Developing the skeleton for the new character to hang off, while hanging in the shadows, is difficult. Maybe this is an experience thing. Practice makes perfect and all that. I guess the question becomes: how do you get better at the descriptive aspects, the creative aspects, the design parts of the game? How do you transition from a player in the background to a GM? Or, in smaller increments, to a player that steps to the fore more often?
I know our main GM would love more of the descriptive elements in the game. Hell, he'll give XP rewards for descriptive write-ups of game content and die modifiers if you give good descriptions of actions. So, I ask you, gentle reader: how do you work on your descriptions, your histrionics, your character and world development? How do you make your game better?
Posted over 7 years ago by dj2
I've wandered back into Objective-C coding land recently. After spending so much time doing Ruby work I've gotten used to writing unit tests and using mock objects. To that end, I spent a bit of time figuring out OCUnit (built into Xcode) and OCMock. I figured I'd write some of this down so I don't forget for the next time I try to set this all up. We're going to work with a simple Cocoa Application that connects to an external web resource, in this case, Google Reader authentication. We'll set up the testing bundle to run when our main application is built and create a mock object so we don't hit the external resource on every test run. I'm going to be using Xcode version 3.2 and create a Cocoa Application called UnitTesting. If you build and run this application you should see an empty window on screen. If you bring up the Build Results window (Build -> Build Results) you'll see some information about the current build. With our app set up, we'll start integrating the unit tests. Click on the Targets item in the project drawer. Select Add -> New Target. Select Unit Test Bundle and press Next. (Note, make sure you're in the Cocoa and not the Cocoa Touch section when selecting Unit Test Bundle.) I named my target Unit Tests and hit Finish. At this point if you right click on the UnitTesting target and select Build "UnitTesting" everything should work correctly. If you right click on Unit Tests and select Build "Unit Tests" you should receive a build failure. Let's add some tests. The first step is to create a new group to hold our test files. Right click on the UnitTesting item in the project drawer and select Add -> New Group. Name the new group Tests. Now, right click on the Tests group and select Add -> New File…. You'll want to add an Objective-C test case class. (Note, make sure you're in the Cocoa Class section of the dialog and not the Cocoa Touch Class section or you'll get an iPhone test class.)
Name the test GReaderTest.m and de-select the Also create "GReaderTest.h" option. The reason for this is that the headers are usually empty so there is no point in creating the extra file. You'll also want to make sure you have the Unit Tests target selected. When working with OCUnit you need to name each of your test classes SomethingTest; the trailing Test is required. In a similar vein, each of the tests themselves needs to start with test. So, something like testAuthentication. Since we didn't create a header file we'll need to set up the interface for the test in the .m file.

```objc
#import <SenTestingKit/SenTestingKit.h>

@interface GReaderTest : SenTestCase
@end

@implementation GReaderTest
@end
```

You should be able to build the Unit Tests target now and have the build succeed. If the build doesn't succeed, and you see a message about UIKit, then you selected the Cocoa Touch unit test bundle instead of the Cocoa unit test bundle. With the unit tests building we can add them into our main build as a dependency. Right click on the UnitTesting target in the project drawer and select Get Info. In the General section add a new Direct Dependency for the Unit Tests target. Now, when you press apple-B to build the project you should see your unit tests executed before the main build phase. Ok, with everything set up we can start testing. First step: each test in this set will be using our GReader object, so we'll add setUp and tearDown methods that will be executed before and after each test, respectively.

```objc
#import <SenTestingKit/SenTestingKit.h>
#import "GReader.h"

@interface GReaderTest : SenTestCase {
    GReader *gr;
}
@end

@implementation GReaderTest

- (void)setUp {
    gr = [[GReader alloc] init];
}

- (void)tearDown {
    [gr release];
}

@end
```

This, of course, won't build as we haven't created our GReader object yet. Let's do that now. Right click on the Classes group and select Add -> New File…. Add a new Objective-C class which is a subclass of NSObject. Call this new class GReader.
The new class should be attached to both our main target and the Unit Tests target. Now, for our little app, the first thing we'll need to do is authenticate with Google Reader. There is a really good document on the Reader API from the pyrfeed project. We'll need to post some specific data to a given end point and parse the response. For our unit tests, we don't actually want to hit the Google endpoint. There is too much time involved and we want our unit tests to be fast. So, we'll need to do some mocking in order to verify the call is happening, but not actually make the call itself. For this we'll use OCMock. I'm going to add the test first and then we'll add the OCMock.framework into the project. First, we need to import OCMock. This is done by adding #import <OCMock/OCMock.h> to the beginning of the GReaderTest.m file. The test is defined as follows.

```objc
- (void)testAuthentication {
    id mock = [OCMockObject partialMockForObject:gr];
    [[[mock stub] andCall:@selector(fakeAuthenticationPost:)
                 onObject:self] post:[OCMArg any]];

    [gr authenticateWithUsername:@"dan" password:@"password"];
}

- (NSString *)fakeAuthenticationPost:(NSString *)request {
    NSArray *d = [request componentsSeparatedByString:@"&"];

    STAssertTrue([d containsObject:@"Email=dan"], @"Username not set correctly in request");
    STAssertTrue([d containsObject:@"Passwd=password"], @"Password not set correctly in request");

    return @"SID=mysid\nLSID=mylsid\nAuth=myauth";
}
```

As you can probably tell, we've added two methods. There is only one test, testAuthentication, but we needed a helper function to deal with the mocked post call. Let's take a look at what's going on in these methods. We know that the GReader object will be making a call that we want to mock. So, we need to create a partial mock object that sits around our GReader object. This is done as: id mock = [OCMockObject partialMockForObject:gr];.
With the mock created, we can stub out specific methods of the GReader object; this is called method swizzling in Objective-C land. In order to swizzle the method we have [[[mock stub] andCall:@selector(fakeAuthenticationPost:) onObject:self] post:[OCMArg any]];. So, we create a new stub on our mock object. We then tell the stub to call the fakeAuthenticationPost method defined in the current object whenever the post method is called on the GReader object. Our post method takes one parameter so we specify [OCMArg any] to allow any argument through. Finally, we call authenticateWithUsername:password: on the GReader object to kick off the authentication. The second method we defined, fakeAuthenticationPost, will be called instead of the post method in the GReader object. To that end, it will receive the same parameter, the NSString that is passed to post. To be on the safe side, I'm verifying that we're properly passing the required email and password fields in the string. I then fake some return data that is similar to a successful response from Google. If we try to build the project at this point we're going to get a lot of errors: first, because we haven't added the OCMock framework and, second, because we haven't created the authenticate or post methods for our GReader object. First things first, let's get the OCMock framework set up. To do that, you'll need to download OCMock. You can grab the .dmg file off the OCMock pages. When you extract the archive you'll see the OCMock.framework and the source directories. In order to keep everything in Git, I created a Framework directory in my UnitTesting directory. I then copied the OCMock.framework directory into this new Framework directory. You could also install OCMock in the /Library/Frameworks directory but I like having it in Git as not all the developers may have OCMock installed. Back in Xcode, we need to add the framework to our project. Right click on the Frameworks group in the project directory. Select Add -> New Group.
Name the new group Testing Frameworks. Note, this isn't required; I just like the separation it provides. Finally, right click on Testing Frameworks and select Add -> Existing Frameworks... then press the Add Other… button. Navigate to the OCMock.framework directory you just copied into the Frameworks directory and press Add. Make sure the new framework is hooked up to the Unit Tests target. Right click on the framework and hit Get Info; in the General tab you can verify the targets. Building the project at this point will get some warnings about missing functions and a crash executing the unit tests. In order to fix the crash we need to set up the build to copy the OCMock framework into our build directory. I'm not entirely sure why this is needed, something about one of the paths set in the framework, but it does get things working. Right click on the Unit Tests target and select Add -> New Build Phase -> New Copy Files Build Phase. You need to set the Destination to Absolute Path and the Full Path to $(BUILT_PRODUCTS_DIR). Once the Copy Files phase is created, drag the OCMock.framework from the Testing Frameworks group into the copy phase. The copy phase needs to be placed between the Compile Sources and Link Binary With Libraries phases. At this point, we should be able to execute our build again and we'll get a failure: [NSProxy doesNotRecognizeSelector:post]. This is good. This means our mock is set up correctly and is trying to hook into our non-existent post method. Let's go create a couple of methods so things compile correctly. First we update the GReader.h file as follows.

```objc
@interface GReader : NSObject

- (void)authenticateWithUsername:(NSString *)username password:(NSString *)password;
- (NSString *)post:(NSString *)request;

@end
```

And the implementation.
```objc
#import "GReader.h"

@implementation GReader

- (void)authenticateWithUsername:(NSString *)username password:(NSString *)password {
    [self post:[NSString stringWithFormat:@"Email=%@&Passwd=%@", username, password]];
}

- (NSString *)post:(NSString *)request {
    return @"My post data";
}

@end
```

With that in place our build should succeed. You can make sure that things are working correctly by modifying [gr authenticateWithUsername:@"dan" password:@"password"]; to something like [gr authenticateWithUsername:@"stan" password:@"password"]; and you should see a build failure. That's it. We have the build system set up and the mock objects hooked up. We can now continue down our TDD path. You can see an example project that I've uploaded to GitHub.
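If the partial-mock machinery feels opaque, the same idea is easy to hand-roll in plain Ruby: replace the one method that would hit the network on a single live object, capture its argument, and return canned data. A sketch, using a stand-in Ruby class of my own rather than the real GReader, and no mocking library at all:

```ruby
# A tiny hand-rolled "partial mock": stub a single method on one object
# while leaving the rest of the object untouched.
class FakeGReader
  def authenticate(username, password)
    post("Email=#{username}&Passwd=#{password}")
  end

  def post(request)
    raise "would hit the network"  # stand-in for the real HTTP call
  end
end

gr = FakeGReader.new
captured = nil
gr.define_singleton_method(:post) do |request|
  captured = request                       # record what authenticate built
  "SID=mysid\nLSID=mylsid\nAuth=myauth"    # canned response, no network
end

gr.authenticate("dan", "password")
captured # => "Email=dan&Passwd=password"
```

Mocha or RSpec's `allow(...).to receive(...)` do the same job with nicer syntax; the point is that only `post` is swapped out, exactly as OCMock's partialMockForObject: does.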
Posted over 7 years ago by dj2
We were playing with some ideas over at PostRank and realized we needed a way to give out some unique codes. We didn't want some random string of letters and numbers as that isn't very memorable. We wanted real words. We wanted something entertaining. I spent a bit of time looking around. The closest thing I found to what we wanted was Webster. While Webster would work, I wasn't a big fan of the words we were getting out of the system in some quick tests. After some more fruitless searching, in a fit of not-invented-here syndrome, I created my own. Enter Moniker. Moniker will take a list of descriptive words and a set of animals and give you a string. There are, currently, just over 42 thousand combinations. Enough for what we needed. The system is pretty simplistic and it's up to you to make sure you aren't getting duplicates.

```
titania:~ dj2$ irb
>> require 'rubygems'
>> require 'moniker'
=> true
>> Moniker.name
=> "octagon-zebra"
>> Moniker.name
=> "shallow-lion"
>> Moniker.name
=> "concave-parrot"
```

The code is all up on GitHub so take a look and feel free to play. Let me know if you've got any ideas for improvements.
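The heart of Moniker fits in a few lines. A miniature sketch of my own with abbreviated word lists (the real gem ships far larger lists, hence the 42 thousand combinations):

```ruby
# A miniature Moniker-style generator: pick one descriptive word and
# one animal, joined with a hyphen.
module MiniMoniker
  ADJECTIVES = %w[shallow concave octagon brisk amber]
  ANIMALS    = %w[zebra lion parrot otter heron]

  def self.name
    "#{ADJECTIVES.sample}-#{ANIMALS.sample}"
  end

  def self.combinations
    ADJECTIVES.length * ANIMALS.length
  end
end

MiniMoniker.name          # e.g. "shallow-lion"
MiniMoniker.combinations  # => 25
```

As with the real gem, nothing here prevents duplicates; if you hand these out, track what you've issued.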