16 Jun 2011
A summary of my SPA (Software Practice Advancement) Conference experience.
The session on node.js on Sunday was my first serious introduction to server-side JavaScript and node.js.
Start by downloading the source at http://nodejs.org/#download, extract it, then run ./configure and make install in the source directory. It takes a few minutes to build, and the build works painlessly on OSX and Linux. If you are on OSX you can also install it via brew:
http://shapeshed.com/journal/setting-up-nodejs-and-npm-on-mac-osx/
The documentation is at http://nodejs.org/docs/v0.4.8/api/
Node.js is a good way to get into event-driven, non-blocking programming. It’s easy to do this when you think about doing things (sending responses, rendering) only when things happen.
For example, when data arrives on a socket the server is listening on, an event is triggered and the code registered for that event is executed. This avoids polling and waiting for things to happen, which can be very inefficient.
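To make the idea concrete, here is a minimal sketch of the same callback style in C# (the language used for the code samples later on this blog); the port number and the handler behaviour are arbitrary choices, and node.js expresses the equivalent in only a few lines of JavaScript.
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// A sketch of event-driven, non-blocking socket handling: callbacks are
// registered up front and run only when connections or data actually arrive.
class EventDrivenServer
{
    static void Main()
    {
        var listener = new TcpListener(IPAddress.Loopback, 8124);
        listener.Start();

        // Register a callback instead of blocking on Accept.
        listener.BeginAcceptTcpClient(OnClientConnected, listener);

        Console.WriteLine("Listening; the main thread is free to do other work.");
        Console.ReadLine();
    }

    static void OnClientConnected(IAsyncResult result)
    {
        var listener = (TcpListener)result.AsyncState;
        TcpClient client = listener.EndAcceptTcpClient(result);

        // Keep accepting further connections without blocking.
        listener.BeginAcceptTcpClient(OnClientConnected, listener);

        var buffer = new byte[1024];
        NetworkStream stream = client.GetStream();

        // This callback fires only when data actually arrives on the socket.
        stream.BeginRead(buffer, 0, buffer.Length, readResult =>
        {
            int bytesRead = stream.EndRead(readResult);
            Console.WriteLine("Received: " + Encoding.UTF8.GetString(buffer, 0, bytesRead));
            client.Close();
        }, null);
    }
}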
This got me thinking about tiny programs running in a system and only doing things as a result of something being triggered. This could lead to us writing only the code the system actually needs. Code that is not used by the system (i.e. not triggered) can be culled.
Going on with the js theme, the guys from Caplin Systems showed how to build applications with js while still testing the full stack. We were shown how to use Eclipse and JSTestDriver, and taken through building the full application stack using domain and view models in js, while using a Model-View-ViewModel pattern with knockout.js to bind the domain to client HTML.
I had mixed feelings about attending this session but in the end it was worth it. Puppet is an open source platform for managing systems, similar to Chef. Puppet and Chef use recipes to build and configure machines. It seems to work smoothly with Ubuntu, using apt-get to install and configure the software as specified in a recipe. Still no good Windows support though, which is going to make it hard to use at work.
It is also possible to use Puppet to control/build virtual machines using Vagrant. There is also a VMware API and a Ruby gem for that API. For further reading on this please follow the links below.
http://www.jedi.be/blog/2009/11/17/controlling-virtual-machines-with-an-API/
http://rubyvmware.rubyforge.org/
Non-Technical Sessions
I enjoyed the non-techy sessions very much. To start it off there was Rachel Davies’ session on building trust within teams. The slides from it are up now at http://www.agilexp.com/presentations/SPA-ImprovingTrustInTeams.pdf
Benjamin Mitchell’s session on double loop learning was insightful. It made me think about how much my perception of things doesn’t necessarily reflect reality. It is better to seek knowledge than to take actions based on assumptions, and it became clear how easily we can fall into this trap. There is more reading on double loop learning and the work of Chris Argyris here: http://bit.ly/Argyris
Developer anarchy at Forward clearly illustrated that to go faster, you need great devs and you need to ditch technology that slows down feedback loops. It’s not just about building feedback loops; how fast you can react to them is what matters. At Internet scale, we would have to respond in seconds or minutes, at most in hours. Definitely not in days, sprints or months. In the conversation afterwards, I learnt that they spent about six months rebuilding the tools and infrastructure that let them deliver at the speed they do now.
Comic Communication and Collaboration completed the SPA experience with much hilarity and fun. I think I could try my hand at xkcd-style comics. More importantly, the insight learned was: communicate by talking with peers, or communicate by producing something (i.e. readable code, working software). If you have to communicate via email, or worse through a third party (i.e. a project manager), don’t bother. It’s not as effective as you think.
All in all, a very well organised conference, including the invited rants. Looking forward to next year.
Many thanks to those who organised it.
11 Jun 2011
The problem
There are tests (mostly what we call acceptance tests). The system under test (SUT) depends on a couple of web services to do its work. The problem I’m faced with is that, when I write these tests, the external web services have to be arranged with data for the test, or the test has to rely on existing data. Writing these tests is extremely painful. Knowledge of the magic existing data is required, and in the end what we are really writing are integration tests. But we don’t want integration tests.
At 7digital, we are exposing more of our domain via HTTP APIs, in a “NotSOA” manner. To test each service in isolation, it becomes necessary to mock its dependencies.
Solutions.
There are a couple of solutions to this.
Set up a stub HTTP web service somewhere, and let it return canned responses. It behaves like the real web service, but only returns responses that have been arranged. The disadvantage of this approach is that I have to know what canned responses have already been set up.
To change the response for a particular test I have to make changes to the stub server and deploy it, as it is a separate application. It takes the focus away from writing the test I’m concerned with.
Another way is to insert some sort of “switch” in production code that will return canned responses when under test. I don’t like this approach because it adds code to production just for the sake of the tests.
My solution.
What I want to do is something similar to setting up mocks/stubs in unit tests, but with an actual HTTP server: set up the stubbed responses in the test code itself, without making any change to production code other than a configuration change.
So this is what I came up with:
[Test]
public void SUT_should_return_stubbed_response() {
    IStubHttp stubHttp = HttpMockRepository
        .At("http://localhost:8080/someapp");

    const string expected = "<xml><response>Hello World</response></xml>";
    stubHttp.Stub(x => x.Get("/someendpoint"))
        .Return(expected)
        .OK();

    string result = new SystemUnderTest().GetData();

    Assert.That(result, Is.EqualTo(expected));
}
HttpMockRepository.At creates an HTTP server listening on port 8080, behaving as if it is processing requests under the /someapp path. This is the web service that the SUT will get its data from.
Using the object returned, it is possible to set up stubbed responses using a fluent syntax. The stub server can return text and files. I’ve posted a few more examples on github http://github.com/hibri/HttpMock/blob/master/src/HttpMock.Integration.Tests/HttpEndPointTests.cs
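For instance, because the stub is a real HTTP server, any plain HTTP client can hit the stubbed endpoint directly. Here is a rough sketch using only the calls shown above plus a WebClient; the exact URL layout (/someapp/someendpoint) is my assumption based on the paths used above.
[Test]
public void Stubbed_endpoint_should_be_visible_to_any_http_client() {
    IStubHttp stubHttp = HttpMockRepository
        .At("http://localhost:8080/someapp");

    const string expected = "<xml><response>Hello World</response></xml>";
    stubHttp.Stub(x => x.Get("/someendpoint"))
        .Return(expected)
        .OK();

    // Any plain HTTP client sees the canned response, not just the SUT.
    using (var client = new System.Net.WebClient()) {
        string body = client.DownloadString("http://localhost:8080/someapp/someendpoint");
        Assert.That(body, Is.EqualTo(expected));
    }
}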
Kayak.
I’m using Kayak, a lightweight, asynchronous HTTP server written in C#. Kayak can accept request-processing delegates and post them to the HTTP server listening on the given socket. This allows me to add stub responses at runtime.
Current status.
This is very much a work in progress. HTTP GET works, and there is support for stubbing content types and HTTP return codes. I’ll be able to add to this while changing a very large test suite to not rely on real web services. I’ve created a repository on github at http://github.com/hibri/HttpMock
There are no unit tests yet, but I’ll be adding them soon, as I wanted to prove the concept first.
Describing this as mocking is not entirely correct, but I couldn’t find a term to describe the concept. It is possible to do the same in Ruby using Sinatra.
28 May 2011
Learning Objective-C has been an interesting experience, and this is how I went about it.
My motivation in learning Objective-C was, most of all, to add another language to my toolkit. I also wanted to get behind the mysteries of developing for iOS.
I found a fairly good set of coursework to get started at http://courses.csail.mit.edu/iphonedev/ . It is a very basic introductory course, and the set of presentations guides you through developing a complete iPhone application. Before this I had no clue how to use XCode. It helped me grasp the basic language concepts. Going through the whole set is highly recommended.
There is a very handy Obj-C tutorial at http://cocoadevcentral.com/d/learn_objectivec/
Setting up tests was frustrating in XCode 3. Although XCode 4 has improved on this, it is nowhere near Eclipse or Visual Studio. Skip the built-in test framework (STAssert) in favour of OCHamcrest and you’ll be in familiar territory. There was a bit of hair-pulling in figuring out how to get XCode 4 to use it.
Now that I’ve figured out XCode 4, I’m going through the iOS development videos at http://developer.apple.com/videos/iphone/ .
10 May 2011
Martin Fowler’s post http://martinfowler.com/bliki/TolerantReader.html mirrors my thoughts on consuming web services.
What is a web service wrapper?
A wrapper for a web service is a library that helps you deal with said service in the programming language of your choice. It hides the details of the web service, and saves you the trouble of having to know how to parse XML or JSON. The wrapper gives you first class objects to work with.
Many web service providers provide a wrapper for their services in most programming languages.
Why I don’t like wrappers.
I strongly believe that web services should be simple to use. If you expose a web service via HTTP, your consumers should be able to use any HTTP client to consume the service.
You should be able to use a web service by simply typing the URL for the web service method in the web browser’s address bar and see the result in the browser itself. You should be able to use a command line tool such as curl to call the web service. Using a web service should not be more complicated than this. If you were hardcore even telnet should suffice.
To consume the service in code, the bare minimum a developer should need is a decent HTTP client library and a standard XML/JSON parser. Even a decent string library should suffice to make sense of the HTTP responses. These are pretty much available out of the box in the frameworks that ship with the major programming languages. Of course there are situations where you’ll need more, but that should be the exception and not the norm.
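As a rough illustration (the URL and the element names are made up), consuming an XML-over-HTTP service in C# needs nothing more than the framework’s HTTP client and XML parser:
using System.Net;
using System.Xml.Linq;

public static class TrackClient
{
    // A minimal sketch: the endpoint and the <title> element are hypothetical.
    public static string GetTrackTitle(int trackId)
    {
        using (var client = new WebClient())
        {
            // A plain HTTP GET; a browser, curl or telnet would work just as well.
            string xml = client.DownloadString(
                "http://api.example.com/tracks/" + trackId);

            // A standard XML parser is all that is needed to read the response.
            XDocument doc = XDocument.Parse(xml);
            return (string)doc.Root.Element("title");
        }
    }
}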
From the point of view of a web service provider, this simplicity increases adoption of your web service. Consumers don’t have to wait for you to publish a new version of the wrapper library in order to start using a new service endpoint. Maintenance of the wrapper library is a non-issue, as you can focus on fixing issues with the service itself rather than the wrapper library.
Avoid using wrappers internally.
When building a web service, avoid the temptation to use wrappers in your acceptance and integration tests. Strongly typed wrappers are a bad idea. I’ve seen this first-hand when writing tests while building the 7digital API. Don’t even parse responses into strongly typed objects. I’ve forbidden the use of wrappers and strongly typed objects for testing the 7digital API within my team.
The reason for this is that, as a provider, you have to use the service as a consumer would. Wrappers hide the complexity of your own service, and you won’t know how complex the service has become. When you work with bare HTTP response strings, you will see the potential usability issues that consumers will face.
Publish sample code, but not wrappers.
If you are providing a web service, my recommendation is to publish sample code, and not wrappers around your service. Show developers how to consume the service in their favorite programming languages. A good idea is to give them tests as Martin Fowler recommends. The tests can serve as sample code. They can run those tests against your service and see where the problem lies.
Thoughts
In my experience, using a strongly typed language such as C# to write these wrappers has been a bad idea. Dynamic languages like Ruby can be used to write more tolerant wrappers, because with Ruby you can evaluate the API responses at run time rather than having to use an object that requires the response to be in a certain format.
04 May 2011
I was reading http://7enn.com/2011/05/02/the-biggest-time-eater-in-unit-testing/ the other day.
How do we get past the discovery phase without getting stuck?
My suggestion is to start the test with what you know and go with it. Think about it as you write the test; it will become clearer gradually.
I’ll explain with an example.
The test name. This is the part where most of us get stuck. It’s an art, and getting it right is solving half the problem, but let’s not get stuck on it. It doesn’t matter if we don’t have the perfect test name at the start. Give it a name that is good enough.
[Test]
public void Foo () {
}
This is all we need for now.
We have name, we have a test, our test runner can run the test.
Next: what is the proper outcome to test for? In other words, what are we testing?
Start with the assert first. Assume we want the title property in some object set to the expected value. Don’t worry about what type the object is if you don’t have it already.
[Test]
public void Foo (){
Assert.That(fooObject.Title, Is.EqualTo(expectedTitle));
}
Start with the smallest thing you can assert. Let’s not be concerned with all the other things that need to be tested. In the example code above, I’m testing that an object named fooObject has its Title property set to the expected string. At this point I’m not concerned about the exact value of the expectation.
We have an assert. Something needs to happen in the thing we are testing. This is the action. Think about where we get fooObject from.
Some action needs to be executed to get an instance. This action is a method. In code, the way to make something happen is to call a method.
Create a method. Let’s not get stuck on the name too much. I’ll call it Get(). Calling Get() gives me fooObject.
[Test]
public void Foo (){
var fooObject = controller.Get();
Assert.That(fooObject.Title, Is.EqualTo(expectedTitle));
}
We have an assert and an action. The code isn’t even compiling yet. Keep typing.
Our controller object needs to be an instance of something. Is this an existing object we are testing? If not, I’ll create it. We always have a context in which we are writing a unit test. This gives us an indication of what the thing we are testing should be called.
In this example, we are testing a Controller. Instantiate the controller object.
[Test]
public void Foo (){
var controller = new HomeController();
var fooObject = controller.Get();
Assert.That(fooObject.Title, Is.EqualTo(expectedTitle));
}
Next, use ReSharper (or a similar tool for your language) to create the missing objects. Let’s make the code compile at this point. Give a value for the expectation. Don’t worry about which project (in Visual Studio) the classes will be in. It’s fine to keep them in the test class for now.
[Test]
public void Foo (){
string expectedTitle = "some title";
var controller = new HomeController();
var fooObject = controller.Get();
Assert.That(fooObject.Title, Is.EqualTo(expectedTitle));
}
We have an assert, an action and an arrange. Read it the other way around: Arrange, Act and Assert, the three parts we need for a test.
Run the test, see it fail. Write the bare minimum of code to make the test pass. Even hardcode things. Get a green bar. Even give temporary names.
Let’s refactor. Rename things. The test is much clearer now. Rename Foo to ShouldSetTitleOnView.
Run the tests again, and keep renaming things. Move files to where they should be.
At this point we can think about injecting dependencies. Remember those hardcoded values we put in to make the test pass?
Inject a dependency to provide those values. There is now a place to inject dependencies: the constructor of the HomeController.
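A rough sketch of where the test might end up after refactoring; the ITitleProvider abstraction, its stub and the shape of HomeController’s constructor are illustrative assumptions rather than part of the original walkthrough.
[Test]
public void ShouldSetTitleOnView() {
    string expectedTitle = "some title";
    var controller = new HomeController(new StubTitleProvider(expectedTitle));
    var fooObject = controller.Get();
    Assert.That(fooObject.Title, Is.EqualTo(expectedTitle));
}

// A hand-rolled stub standing in for the real dependency that supplies the title.
public interface ITitleProvider {
    string GetTitle();
}

public class StubTitleProvider : ITitleProvider {
    private readonly string _title;
    public StubTitleProvider(string title) { _title = title; }
    public string GetTitle() { return _title; }
}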
To summarize,
Writing the assert first gives a focal point for the test. Working backwards from this focal point ensures that we write the bare minimum of code needed to make the test pass. We don’t need to worry about injecting dependencies that are not even relevant to what we are asserting.
Avoid thinking about your team/company standards for how you should write your tests. Just write the test with what you know. You can always rename things to the appropriate standard later, when you have a green bar.
You can follow the same pattern in unit tests, integration tests or acceptance tests. Doing this speeds up the discovery phase, because writing the test is what drives the discovery.