SockJs App on Cowboy with Rebar
## Acquire a rebar binary

I chose to clone rebar's git repository and build it instead of downloading the binary from https://github.com/basho/rebar/wiki/rebar

In order to extend my project to have multiple Erlang applications, I put the applications in an `apps` folder. To support that structure, I created a simple `rebar.config`:

    %%-*- mode: erlang -*-
    {sub_dirs, ["apps/example", "rel"]}.
    {deps, [
        {cowboy, ".*", {git, "git://github.com/extend/cowboy.git", "master"}},
        {sockjs, ".*", {git, "git://github.com/sockjs/sockjs-erlang.git", "master"}}
    ]}.

The `sub_dirs` option tells `rebar` where to look for the apps. The `deps` option specifies which dependencies to download: in this case, `cowboy` and `sockjs-erlang`.

    $ mkdir --parent apps/example
    $ cd apps/example
    $ ../../rebar create-app appid=example
    $ cd ../..

This creates a `src` folder inside `apps/example` with three files:

* `example.app.src`: the definition of the application, which `rebar` copies over to `ebin` after compiling.
* `example_app.erl`: the main application module.
* `example_sup.erl`: the supervisor module.

I also created a `start.sh` script to start the server.

## Structure

    ├── apps
    │   ├── example
    │   │   ├── src
    │   │   │   ├── example.app.src
    │   │   │   ├── example_app.erl
    │   │   │   ├── example_sup.erl
    ├── echo.html
    ├── rebar
    ├── rebar.config
    ├── start.sh

The contents of those files are available at [this gist](https://gist.github.com/655869fe29a2709948b9).
Erlang, Entity System, gen_event. Oh my!
Last night, I was reading about Erlang’s gen_event and thought to myself: ‘Hmm, this is interesting. It looks like a good candidate for an entity system relying on message passing’. So I sat down, read more about it, and coded up some components, and I was pretty happy about the results.
What is Entity System?
This is a really big subject, so I would suggest googling it for more info. Basically, it is a departure from the traditional OOP approach in game development, which gets quite messy when you have lots of different types of entity. An Entity System breaks a game object's (or entity's) data and behaviors down into components and/or subsystems, and the entity is just a container for those components. That way you can mix and match different components to get all kinds of interesting entities.
What is gen_event?
Note: I’m an Erlang beginner, so I’m sure I will be wrong, please correct me!
gen_event is a behavior in Erlang in which a process acts as an event manager. This event manager can have multiple event handlers (which are callback modules) attached to it, and it can notify all of its handlers of any event. The canonical example is error_logger, which, on an error event, can have handlers that output to the console, to a file, etc.
Why gen_event and Entity System?
One of the major design questions for an Entity System is how to handle communication between an entity and its components, or between the components themselves. There are different ways to do it, such as having one component keep a reference to other components, or using message passing.
Erlang already provides efficient message passing, and the gen_event behavior lets us attach a set of handlers to a specific event manager. Mapping this to an Entity System, we could think of each entity as an event manager process, and each component as an event handler. Each component handles a specific piece of the entity's state (such as HP or position). Events from the world cause the entity to notify its components, which modify that state.
Again, I may be wrong, but let’s experiment!
Experiments
Because of the way Erlang works, I cannot have each module be an entity; each one would need a name. So instead I created a module that acts as an entity manager and spawns entities (each one an event manager, identified by its Pid). It also has some wrapper functions around the gen_event functionality.
Nothing special here, just the basics of gen_event.
Let’s test it out by creating a component for health, called health_component:
    -module(health_component).
    -behaviour(gen_event).
    -export([init/1, handle_event/2, terminate/2]).
I specified the behavior of this component as gen_event, and export the required functions.
The first function is init/1, which is executed when gen_event:add_handler(Entity, health_component, Args) is called. The Args is passed through. This function must return a tuple {ok, State}.
    init(HP) ->
        {ok, HP}.
So in here, the only state I need for this health component is the HP of the entity. You can specify the amount of HP the entity has by passing it as Args.
The next required function is handle_event/2, which is executed when gen_event:notify(Entity, Event) is called. Event can be anything, and it will be passed to component’s handle_event/2 as the first argument.
    handle_event({hit, Damage}, HP) when Damage =< HP ->
        NewHP = HP - Damage,
        io:format("Entity got hit with ~p damage and has ~p HP left.~n", [Damage, NewHP]),
        {ok, NewHP};
The interesting bit here is that the State of the component is handled automatically, which is not mentioned clearly in the Erlang documentation (or I might have skipped it). That way you don't have to keep track of how much HP the entity currently has yourself. Again, this is probably obvious to Erlang programmers, but bear with me.
Anyway, I expect an event in the format {hit, Damage} when the entity gets hit while it still has enough HP. Like I said before, the HP argument reflects the current HP state, and you don't have to pass it manually.
To make things more interesting, I added pattern matching for a few more events in this component:
    handle_event({hit, _Damage}, 0) ->
        io:format("Entity is dead already -.-.~n"),
        {ok, 0};
    handle_event({hit, Damage}, _HP) ->
        io:format("Entity took ~p damage and died.~n", [Damage]),
        {ok, 0};
And lastly, since all gen_event handlers attached to a given event manager receive every event, we need a final clause to ignore all other events.
    % The failsafe callback for other events that this component does not care about
    handle_event(_, State) ->
        {ok, State}.
terminate/2 is called when the component is removed from the entity. We don't do anything special here.
    terminate(_Args, _State) ->
        ok.
Testing
Bonus
I created another component to see if I could expand it. This component handles 2D position for the entity:
A really cool thing with Erlang is hot swapping. With the newly created component, I just had to compile it, then add it to the existing entity (without stopping it), and it just worked. Awesome!
Just a quick note: if you don't have a catch-all handle_event clause in your component, it will crash when any other event is sent to it.
I am playing with this more to see how this would work with a full Entity System.
My web life through the browser. Google Chrome does have some nice offline apps. Now if only I could find a good web-based editor that is as good as Sublime Text 2. Hopefully Adobe's Brackets will come close.
OS: Archbang
Text Editor: Sublime Text 2
IM: Chat for Google, IMO for the rest
Email: Offline Google Mail
Calendar: Google Calendar
Social: Tweetdeck for Twitter and Facebook
Project Management: Trello
News Reader: Hacker News on RSS Feed Reader
Enabling Erlang on a PaaS with the OpenShift DIY App Type
It is hard to find a PaaS that supports Erlang, but with the introduction of OpenShift by RedHat, we can have an Erlang environment in the cloud using the OpenShift DIY template.
The first thing you will need to do is get an OpenShift account (it's free).
Since I don't want to install Ruby, I opted for the OpenShift web dashboard to manage my apps.
First, add a new application and choose Do-It-Yourself Application Type. Choose your app name and your namespace. Let's say erlang-fu.rhcloud.com.
Once done, go back to your dashboard and open the newly created application; you will see the git address for your app, such as
ssh://[email protected]/~/git/erlang.git/
Log into your OpenShift application using the SSH credentials:
We will build everything in the OpenShift application tmp directory.
Navigate into the tmp directory:
cd $OPENSHIFT_TMP_DIR
Download the latest Erlang source release:
wget http://www.erlang.org/download/otp_src_R15B01.tar.gz
Extract:
gunzip -c otp_src_R15B01.tar.gz | tar xf -
Note that Erlang requires the ncurses, termcap, or termlib development headers and libraries (often packaged as ncurses-devel), which are not installed in your OpenShift app. Therefore, we need to build ncurses as well:
wget http://ftp.gnu.org/pub/gnu/ncurses/ncurses-5.9.tar.gz
tar -xf ncurses-5.9.tar.gz
cd ncurses-5.9
./configure --prefix=$OPENSHIFT_RUNTIME_DIR
make && make install
cd ..
However, I could not get Erlang's configure script to detect ncurses, so I installed without it instead (using --without-termcap). In this case, only the old Erlang shell (without any line editing) can be used.
cd otp_src_R15B01
./configure --prefix=$OPENSHIFT_RUNTIME_DIR --without-termcap
make
make install
You can now check that Erlang was successfully installed:
$OPENSHIFT_RUNTIME_DIR/bin/erl
You should be dropped into Erlang shell!
You can also add Erlang to your PATH:
export PATH=$OPENSHIFT_RUNTIME_DIR/bin:$PATH
You can then play around with it, or install your favorite web framework (maybe I will install Rebar and Cowboy in another blog post).
(Inspired by this blog post)
Syntax Highlighting for Firebug
Installed [Acebug](http://www.flailingmonkey.com/acebug/) for the Console with Monokai theme, and [FireRainbow](http://firerainbow.binaryage.com/) for the Script tab. Debugging just looks nicer.
Dependency Injection with Node.js
I am currently doing test driven development for a project with Node.js. When I was testing a User 'class' that is supposed to communicate with a Mongo database, I thought: what if I chose not Mongo but another database instead, or what if I could not get a database at the moment?
I decided to test with a mock repository instead, using a kind of dependency injection. I use Mocha as my test runner and Chai.js for assertions.
At first, I had a user-tests.coffee that I wrote test cases in. My UserMemory.coffee acts as an in-memory repository that implements register and login methods. These methods interact with an array, pushing to or querying from it.
    User = require('../libs/UserMemory').User

    describe "User Repo", ->
      userRepo = null

      beforeEach (done) ->
        userRepo = new User()
        done()

      describe "when registering a new user", ->
        ### test cases here ###
All the tests passed. Now I want to test User with MongoDB. I could have created the same tests and put them into a user-mongo-tests.coffee with a UserMongo.coffee. But then we would have duplicate code everywhere. Smelly!
So, I extracted the tests into a file shared-user-tests.coffee as below:
    should = require("chai").should()

    exports.shouldBehaveLikeAUserRepo = ->
      describe "when registering a new user", ->
        ### Test cases here ###
The user-tests.coffee now only has very little code:
    UserMemory = require('../libs/UserMemory').User
    shared_tests = require('./shared-user-tests')

    describe 'User Memory Repo', ->
      beforeEach (done) ->
        @userRepo = new UserMemory()
        done()

      shared_tests.shouldBehaveLikeAUserRepo()
Note the @ in front of userRepo? This is a cool feature of Mocha that lets you store state on a context shared by all test cases. You just have to put @ in front of userRepo in all the shared test cases as well.
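To see why the @ (i.e. `this`) trick works, here is a tiny plain-JavaScript imitation of what Mocha does: the suite's hooks and tests are invoked with the same context object bound to `this`, so whatever a hook stores there is visible to shared test functions. This is a simplification with made-up names, not real Mocha code:

```javascript
// A toy illustration of Mocha's shared test context: hooks and tests are
// called with the same object bound to `this`.
function runSuite(hooks, tests) {
  const ctx = {}; // one context object per suite, like Mocha's `this`
  for (const test of tests) {
    hooks.beforeEach.call(ctx); // beforeEach stores state on `this`
    test.call(ctx);             // the test reads it back from `this`
  }
}

const log = [];
runSuite(
  { beforeEach() { this.userRepo = { users: [] }; } },
  [
    function () { log.push(this.userRepo.users.length); },
    function () {
      this.userRepo.users.push('alice');
      log.push(this.userRepo.users.length);
    },
  ]
);
console.log(log); // [ 0, 1 ]
```

Because beforeEach resets `this.userRepo` before every test, each test sees a fresh repository, which is exactly what the shared tests rely on.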
Writing a user-mongo-tests.coffee is now pretty simple:
    UserMongo = require('../libs/UserMongo').User
    Mongo = require("../libs/Mongo").Mongo
    shared_tests = require('./shared-user-tests')

    describe 'User Mongo Repo', ->
      dbInfo =
        name: 'test'
        url: 'localhost'
        port: 10015
      mongo = new Mongo(dbInfo)

      before (done) ->
        mongo.connect =>
          @userRepo = new UserMongo {repo: mongo}
          done()

      beforeEach (done) ->
        users = mongo.getCollection 'users'
        users.remove ->  # clear the database
          done()

      shared_tests.shouldBehaveLikeAUserRepo()
The libs/Mongo module is a custom-written adapter for talking to MongoDB.
Note: your UserMongo.coffee should be written to query MongoDB directly, instead of using an in-memory array like UserMemory.coffee does.
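The injection pattern itself can be sketched in a few lines of plain JavaScript (the post uses CoffeeScript; names like `InMemoryRepo` here are my own for illustration): the User 'class' only talks to whatever repo is passed into its constructor, so swapping Mongo for memory touches no User code.

```javascript
// Sketch of the dependency injection: InMemoryRepo stands in for the
// post's array-backed UserMemory; a Mongo-backed repo would expose the
// same insert/find methods and be injected the same way.
class InMemoryRepo {
  constructor() { this.rows = []; }
  insert(doc, cb) { this.rows.push(doc); cb(null, doc); }
  find(query, cb) {
    cb(null, this.rows.filter((r) => r.name === query.name));
  }
}

class User {
  constructor(opts) { this.repo = opts.repo; } // dependency injected here
  register(name, password, cb) {
    this.repo.insert({ name, password }, cb);
  }
  login(name, password, cb) {
    this.repo.find({ name }, (err, rows) => {
      if (err) return cb(err);
      cb(null, rows.some((r) => r.password === password));
    });
  }
}

const users = new User({ repo: new InMemoryRepo() });
users.register('alice', 's3cret', () => {
  users.login('alice', 's3cret', (err, ok) => console.log(ok)); // true
});
```

The shared test file then only cares that `@userRepo` honors the register/login contract, never which backend is behind it.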
MongoDB MapReduce
I got my first map-reduce function working today with MongoDB. Well, it's not a full map-reduce, just a group function, but it was a good learning experience.
We have a collection that stores all the activities of a user's child (a user can have many children) when the child does a quiz or reads a book, etc. We use the ActivityStrea.ms format to store the activities.
So one of my tasks is to mark all the quizzes each user's child has completed, using the latest attempt as the score (corrects over total). I went the extra mile and added some statistics to the requirements list: the number of attempts, the overall total questions, and the overall total corrects.
So I have a list of quiz ids and the username. I loop through the quiz ids, find the whole quiz, then use the group function to find all the related activities grouped by child name, attach the result to the quiz, and return it to the browser (or the Backbone Model in my case).
The piece of code to perform the grouping is:
    $keyf = new MongoCode('
        function(activity) {
            return {child: activity.actor.displayName};
        }
    ');

    $initial = array(
        "attempts" => 0
      , "corrects" => 0
      , "total" => 0
    );

    $reduce = 'function (activity, prev) {
        prev.attempts++;
        if (typeof(prev.latest) === "undefined" || prev.latest.updated < activity.updated) {
            prev.latest = activity;
        }
        for (var i = 0; i < activity.object.questions.length; i++) {
            prev.total++;
            var question = activity.object.questions[i];
            if (question.options[4] === question.options[0]) {
                prev.corrects++;
            }
        }
    }';

    $condition = array(
        "object.objectType" => "quiz"
      , "object.id" => $quiz_id // $quiz_id is the element in the loop
      , "actor.parent" => $user->email
      , "updated" => array('$exists' => true)
    );

    // Called after all the reduce calls are completed.
    $finalize = 'function(out) {
        out.latest_total = 0;
        out.latest_corrects = 0;
        for (var i = 0; i < out.latest.object.questions.length; i++) {
            out.latest_total++;
            var question = out.latest.object.questions[i];
            if (question.options[4] === question.options[0]) {
                out.latest_corrects++;
            }
        }
    }';

    $progress = $activities->group(
        $keyf, $initial, $reduce,
        array("condition" => $condition, 'finalize' => $finalize)
    );
$progress will have output like this:
    [
        { "child": "Child 1"
        , "attempts": 1
        , "corrects": 9
        , "total": 10
        , "latest": {/* the latest activity object */}
        , "latest_corrects": 9
        , "latest_total": 10
        }
      , { "child": "Child 2"
        , "attempts": 3
        , "corrects": 27
        , "total": 30
        , "latest": {/* the latest activity object */}
        , "latest_corrects": 8
        , "latest_total": 10
        }
    ]
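Since the reduce and finalize functions are just JavaScript that MongoDB runs server-side, the accumulation logic can be sanity-checked in plain Node against an in-memory array of activities. This is my own test harness, not part of the app; I keep the post's convention that a question counts as correct when options[4] equals options[0], and the activity objects below are simplified stand-ins for the real ActivityStreams documents.

```javascript
// Re-running the group's reduce/finalize logic over a plain array,
// outside MongoDB, to check the accumulation.
function reduce(activity, prev) {
  prev.attempts++;
  if (prev.latest === undefined || prev.latest.updated < activity.updated) {
    prev.latest = activity; // keep the most recently updated attempt
  }
  for (const question of activity.object.questions) {
    prev.total++;
    if (question.options[4] === question.options[0]) prev.corrects++;
  }
}

function finalize(out) {
  out.latest_total = 0;
  out.latest_corrects = 0;
  for (const question of out.latest.object.questions) {
    out.latest_total++;
    if (question.options[4] === question.options[0]) out.latest_corrects++;
  }
}

// Two attempts by the same child: one question right, then wrong.
const attempts = [
  { updated: 1, object: { questions: [{ options: ['a', '', '', '', 'a'] }] } },
  { updated: 2, object: { questions: [{ options: ['b', '', '', '', 'a'] }] } },
];

const out = { attempts: 0, corrects: 0, total: 0 };
attempts.forEach((a) => reduce(a, out));
finalize(out);
console.log(out.attempts, out.corrects, out.total); // 2 1 2
console.log(out.latest_corrects, out.latest_total); // 0 1
```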
Config EveryAuth with Express
It turned out that in order to use EveryAuth with Express, you have to configure it before configuring Express. Also, make sure to declare the EveryAuth middleware after you declare your session, but before declaring the router.

    everyauth.twitter
      .consumerKey(conf.twitter.consumerKey)
      .consumerSecret(conf.twitter.consumerSecret)
      .findOrCreateUser((session, accessToken, accessTokenSecret, twitterUserMetadata) ->
        return twitterUserMetadata
      )
      .redirectPath '/'

    app.engine('dust', cons.dust)

    app.configure ->
      app.set 'views', __dirname + '/views'
      app.set 'view engine', 'dust'
      app.use express.favicon()
      app.use express.logger('dev')
      app.use express.static(__dirname + '/public', {redirect: false})
      app.use express.bodyParser()
      app.use express.methodOverride()
      app.use express.cookieParser('your secret here')
      app.use express.session()
      app.use everyauth.middleware() # EveryAuth middleware declared here.
      #everyauth.helpExpress app
      app.use app.router

I'm using Express 3.0alpha, so `everyauth.helpExpress app` does not work yet.
Using Dust.js with Express3.0alpha on Node.js 0.6.x
Dust.js is a nice template engine for Node.js, but its development has been idle for almost a year.
Now I'm trying to test it on Express with Node.js 0.6.x, and the easiest way turns out to be using consolidate.js.
Consolidate.js only supports the express.js 3.0.x branch, which is not on `npm` yet, so we need to install the CLI from the master branch on GitHub:
$ npm install -g https://github.com/visionmedia/express/tarball/master
Then create a default Express app named test:
$ express -s -e test && cd test
Now we need to install the express module, also from the master branch on GitHub:
$ npm install https://github.com/visionmedia/express/tarball/master
Then do a regular `npm install` to install the dependencies.
The Consolidate.js author has not pushed dust.js support to npm yet, so we need to install it from the master branch as well:
$ npm install https://github.com/visionmedia/consolidate.js/tarball/master
Then install dust.js
$ npm install dust
There are a few changes you need to do before being able to run the app. See the below app.js for changes (old ones are commented out).
    var express = require('express')
      , routes = require('./routes')
      , http = require('http')
      , fs = require('fs')
      , path = require('path')
      , cons = require('consolidate');

    var app = express();

    // assign dust engine to .dust files
    app.engine('dust', cons.dust);

    app.configure(function(){
      app.set('view engine', 'dust');
      app.set('views', __dirname + '/views');
      app.use(express.favicon());
      app.use(express.logger('dev'));
      //app.use(express.static(__dirname + '/public'));
      app.use(express.static(__dirname + '/public', {redirect: false}));
      app.use(express.bodyParser());
      app.use(express.methodOverride());
      app.use(express.cookieParser('your secret here'));
      app.use(express.session());
      app.use(app.router);
    });

    app.configure('development', function(){
      app.use(express.errorHandler());
    });

    //app.get('/', routes.index);
    app.get('/', function(req, res){
      res.render('index', { title: 'Testing out dust.js server-side rendering' });
    });

    http.createServer(app).listen(3000);
    console.log("Express server listening on port 3000");
You also need to modify Dust to add support for Node 0.6.x by changing node_modules/dust/lib/server.js as follows:
    - Script = process.binding('evals').Script;
    + Script = require("vm");

    - require.paths.unshift(path.join(__dirname, '..'));
You can now run `node app.js`.
My first project for my current company (Bouncing Pixel). It launched today, March 4, 2012, National Grammar Day. It definitely has bugs in it, since it was rushed to launch, but I'm proud of all the new stuff I was able to put into it :)
I joined this project on January 23, 2012, when it was over 6 months old. We had a little over a month to launch it for National Grammar Day.
It was a PHP-only front-end when I joined. I was allowed to choose between MySQL and MongoDB, and I picked the latter. I also modified a lot of code to use Backbone.js, which caused quite a bit of catching up between me and my teammates (a.k.a. my bosses), but we managed to sort out the differences. The most time-consuming issue I had was making Backbone.js talk with PHP over a somewhat REST interface (we use pure PHP with no framework).
I'm really thankful that my bosses gave me a chance to learn new technologies while doing work. I get paid for doing things that I love. Awesome!
Backbone Save Model with Callbacks
I spent half an hour today trying to figure out why I could not get the callbacks to run when doing:
    model.save({
        success: function() {...}
      , error: function() {...}
    })
Neither success nor error was called, but the database actually got the data.
It turns out that save needs its first parameter to indicate the attributes to save. It can be an empty object if you want to save the whole model.
    model.save({}, {
        success: function() {...}
      , error: function() {...}
    })
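To see why the first call never fires the callbacks, here is a toy version of the signature in plain JavaScript (an illustration of the attributes-then-options convention, not Backbone's actual source):

```javascript
// Toy model.save illustrating why callbacks passed as the FIRST argument
// are silently treated as attributes to persist, never as options.
function save(attributes, options = {}) {
  const persisted = { ...attributes }; // first arg is always data
  if (options.success) options.success(persisted);
  return persisted;
}

let called = false;

// Wrong: success ends up inside the saved attributes and is never invoked.
const bad = save({ success: () => { called = true; } });
console.log(called, typeof bad.success); // false function

// Right: empty attributes object, callbacks in the second argument.
save({}, { success: () => { called = true; } });
console.log(called); // true
```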
A valid concern. Our team has been debating this issue while working with Socket.IO on the Express framework, implementing the lobby system for our game. One way is to use REST from Express to handle requests for the different rooms; the other is to use websockets for everything.
Another problem is with Backbone.js. It relies on a REST interface to sync between client and server, but with Socket.IO, how would it do that? Personally, I think this is where developers like us have to make a choice and pick the best fit for the project. For us, we will stick with Socket.IO.
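One common answer to the Backbone question is to override Backbone.sync so model operations travel over the socket instead of XHR. The sketch below shows the shape of such an override with a stand-in socket object; no real Backbone or Socket.IO is involved, and the event names ('create', 'read', etc.) and helper names are my own assumptions:

```javascript
// A sketch of redirecting Backbone-style sync over a socket. In a real
// app you would bind this function as Backbone.sync and pass a real
// Socket.IO connection instead of fakeSocket.
function socketSync(socket, method, model, options) {
  // method is one of 'create', 'read', 'update', 'delete'
  socket.emit(method, { url: model.url, data: model.attributes }, (err, resp) => {
    if (err) return options.error && options.error(err);
    options.success && options.success(resp);
  });
}

// Stand-in socket that records what was emitted and echoes the data back,
// pretending to be the server acknowledgement.
const fakeSocket = {
  sent: [],
  emit(event, payload, ack) {
    this.sent.push({ event, payload });
    ack(null, payload.data);
  },
};

const model = { url: '/rooms/1', attributes: { name: 'lobby' } };
socketSync(fakeSocket, 'create', model, {
  success: (resp) => console.log('saved', resp.name), // saved lobby
});
```

The point is that the REST verb mapping collapses into socket events, so the rest of the Backbone code never notices the transport changed.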
Deployment Services for Different Languages
When it comes to deploying your app to the public, there are the headaches of server configuration, domains, and then scaling. Well, there are a bunch of services out there that help deploy and manage our apps in a convenient way.

* PHP: www.pagodabox.com
* C#: https://appharbor.com/
* Node.js: http://www.nodejitsu.com
* Java: http://www.cloudbees.com/
* Python: http://www.pythonanywhere.com

I will update this list as I gain more experience using them.
Sync Sublime Text between OS's
I am currently developing on both Windows (for various development languages) and ArchLinux (mostly as a Node.js server). I develop using just a text editor called Sublime Text (v2+). Besides its beautiful interface, fast loading, convenient mini-map, Textmate bundle compatibility, and tons of community plugins that extend it much further, it's cross-platform too.
Together with the famous cross-platform Dropbox, I am able to sync all the packages, custom settings, key-bindings and snippets across my two development environments.
Since I develop mostly on Windows (oh, did I mention that Node.js and CoffeeScript can run on Windows too?), I create a symlink in Dropbox (a folder called Sublime) pointing to my Sublime Packages folder, using the following cmd command:
mklink /D "D:\My Dropbox\Sublime" "D:\Sublime\Data\Packages"
Of course, you have to substitute the first folder with the path on Dropbox where your packages will be stored (this folder MUST NOT exist yet), and the second folder with your current Sublime Packages folder.
Then on my ArchLinux box, I delete the current Sublime Packages folder and issue the following command in a terminal:
ln -s ~/Dropbox/Sublime ~/.config/sublime-text-2/Packages
That's it. Both OSes now have the same Sublime Text setup. Love it!