Tuesday, December 15, 2009

Installing WebSphere 6.1 on CentOS 5.3

My colleagues and I have had a very difficult time trying to get IBM WebSphere 6.1 installed on CentOS 5.3. Thanks to the help of Matt White, we finally believe we have figured out a workaround.

The Issue
The Swing installer does not load after clicking the Install link on the HTML launch page. This issue is documented in both the Red Hat and IBM communities.
Basically, we would perform a netinstall of CentOS 5.3, unpack was.cd.6100.wasdev.nocharge.linux.ia32.tar.gz into a subdirectory, and run ./launchpad.sh. Firefox would load the install page, which contains an Install link that brings up a Java Swing application to perform the actual install. The problem is the Java Swing app never appears. After closing the browser, launchpad complains, "No supported Web browser was detected". We also discovered the following error in /tmp/niflogs/log.txt:

"Process, com.installshield.wizard.StandardWizardListener, err, could not initialize interface swing"

The Solution(s)
We actually never figured out how to solve this issue. We tried different JDKs, different browsers, 64-bit versions, etc. Nothing fixed the problem. Thanks to Matt White, though, we did figure out a possible workaround.

Open a new terminal and go to the websphere directory where you unpacked the installer. cd into the WAS directory and run: sudo java -jar setup.jar. The Java Swing install app appears and you are now able to install WebSphere. We have had mixed results with it automatically creating a default profile (AppSvr01), so if it does not, you can create one manually by running the /opt/IBM/WebSphere/AppServer/bin/ProfileManagement/pmt.sh script.
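In other words (the path below is wherever you unpacked the tarball):

cd /path/to/websphere/WAS
sudo java -jar setup.jar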

The other possible solution is to install an older version of CentOS such as 5.1, install WebSphere 6.1, and then upgrade CentOS to 5.3. I tried this with 5.2 and the launchpad still failed, but we know of others who, we believe, got it working with 5.1.

Monday, November 16, 2009

Preventing Insecure Login pages

If you are a webpage author or submit confidential information on the internet (credit card numbers, your social security number, login credentials for paypal or banking), you might think twice the next time you log into your favorite website. I know I started paying more attention after listening to Steve Gibson's Security Now podcast titled "The Fundamentally Broken Browser Model". That title is a bit confusing, but the underlying issue is very serious. That is why I would like to explain the issue further with a simple example, then a real-world example using facebook, and then discuss a simple solution.

First, what does Mr. Gibson mean by The Fundamentally Broken Browser Model? Well, at this year's Black Hat conference, a hacker named Moxie Marlinspike gave a presentation where he talked about how he was able to capture sensitive information at a public WiFi hotspot using open tools he created. Specifically, during a 24-hour period, he intercepted 114 logins to yahoo, 50 logins to gmail, 42 to ticketmaster, 14 to rapidshare, 13 to hotmail, 9 to paypal, 9 to LinkedIn, and 3 to facebook. So how did Moxie do it? He took advantage of a common flaw in most login pages: the login pages themselves are not delivered to the client over SSL, allowing a man-in-the-middle to change the submit URL.

SSL (Secure Sockets Layer) has long been the common method for securing HTTP, but for performance reasons it has been limited to only the sensitive areas of a website. Sites have always protected a user's private information (usernames, passwords, credit card numbers, etc.) using SSL, but few have thought about securing the login page itself, and this is a huge problem since most sites don't encrypt their login pages. Not encrypting a login form leaves it open to modification before it reaches the user.

This attack has 2 basic steps. First, the malicious user needs to be on the same network (LAN) and use ARP (Address Resolution Protocol) spoofing to insert himself between connections (this is pretty scary considering how many times I have connected to restaurant, hotel, and airport WiFi hotspots). Second, a LAN user has to visit a login page that was received over a non-SSL connection.

Simple Example
To explain the issue better, let me use a simple example (let's leave ARP spoofing out for now). A user visits an e-commerce site http://www.eco.com. Like most sites, eco.com includes a Login link at the top. Clicking on this link takes the user to http://www.eco.com/login, which returns a simple login form in HTML.

<form method="POST" action="https://www.eco.com/login/authenticate">
Username: <input name="username" type="text">
Password: <input name="password" type="password">
<input value="Submit" type="submit">
</form>
The user types in their username and password, clicks the Submit button, and their private information is sent encrypted over SSL. It is sent over SSL because the form's action value is set to an SSL URL (https://www.eco.com/login/authenticate). But there is one big problem. The Login link the user clicked was non-SSL (http://www.eco.com/login), meaning the response back to the client was sent over the network as clear text and could easily be modified by a man-in-the-middle using ARP spoofing. A malicious user could change the form's submit URL from https://www.eco.com/login/authenticate to http://www.eco.com/login/authenticate and the user would never know it happened; there are few default clues to indicate anything is wrong. So a malicious user changes the submit URL to a non-SSL URL, the user clicks Submit, and their credentials are sent as clear text over the network.
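For illustration, here is the same form after a man-in-the-middle rewrite; the only change is the scheme of the action URL:

<form method="POST" action="http://www.eco.com/login/authenticate">
Username: <input name="username" type="text">
Password: <input name="password" type="password">
<input value="Submit" type="submit">
</form>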

facebook example
Armed with this new information, I wondered how some of my favorite sites handle this situation. No matter what I tried, it looked like gmail, ebay, and paypal were safe and used SSL for their login pages. So that gave me some peace of mind. However, facebook provides a perfect bad example.

If you're like me, you type facebook in your browser and use the CTRL+ENTER keyboard shortcut to fill in the rest, ending up at http://www.facebook.com. If you are not currently logged into facebook, you are presented with a home page that includes a form to register or log in.


Again, this home page includes a login form that was sent over a non-SSL connection. As expected, if you look at the source, you can see the HTML form does post securely to the URL https://login.facebook.com/login.php?login_attempt=1. So what do you do if you don't want to fill in a form that was received over a non-SSL connection? Fortunately, facebook also supports SSL for its home page (https://www.facebook.com); it just takes a little awareness and an extra step.

Solution
More sites need to use SSL for their login forms, not just for the request where users post their credentials. At a minimum, like facebook, sites should also support SSL for the login page, and ideally all login forms would automatically be requested over SSL. If you come across a site that does not initially use SSL for its login page, try using https. If that fails, consider a VPN solution or even a traveler's router.

Saturday, November 7, 2009

Google App Engine with Grails

After I recently got a grails app working with spring-security, REST, and caching, I wanted to try my luck at creating a Google App Engine application using grails. Unfortunately, I think it's still pretty early, as I ran into several issues right away. However, I see enormous potential once these issues get resolved.

I am using grails v1.1.1, app-engine plugin v0.8.5, and gorm-jpa v0.5. The first issue I ran into was deploying with grails app-engine deploy. It failed on subsequent runs after I initially deployed the WAR using /app-sdk/bin/appcfg.sh, complaining that I had not supplied an email address. Turns out there is a bug related to ant that should get fixed soon. The workaround was simple enough; just hit Enter before typing in your email address.

Secondly, I was disappointed to find out that the spring-security (acegi) plugin isn't compatible with GAE. Based on the compile errors I received locally, it appears spring-security depends on hibernate, which is not an option in GAE (you get either JDO or JPA). I chose JPA because, according to the documentation, JDO doesn't support any of the GORM dynamic finder methods whereas JPA supports most of them. I guess now that I think about it, I'm not surprised acegi didn't work in GAE; just disappointed.

I also found and submitted a minor issue concerning the default delete action in generated controllers. It seems the gorm-jpa plugin doesn't wrap the delete method in a [Domain].withTransaction closure, which apparently is required (see the save action).
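For illustration, here is roughly what the patched delete action would look like (Event is a hypothetical domain class here):

def delete = {
    def eventInstance = Event.get(params.id)
    if (eventInstance) {
        // gorm-jpa needs write operations wrapped in a transaction,
        // just like the generated save action does
        Event.withTransaction {
            eventInstance.delete()
        }
        redirect(action: list)
    }
}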

Finally, I was unable to get content negotiation working using the withFormat concept I described here. It worked locally running grails app-engine run, but threw an exception in GAE:

Caused by: java.security.AccessControlException: access denied java.lang.RuntimePermission getClassLoader

My guess is it's related to the security issues they report in the FAQ:

"The current preview release of the Google AppEngine SDK has a bug that doesn't allow it to run Groovy code when the full permissions restrictions are used. This will be fixed in the next release, but in the meantime the development environment runs without emulating the permissions restrictions of the actual AppEngine environment."
Overall, I enjoyed the easy commands it provides to build, run, and deploy to GAE. The deploy was surprisingly quick, and I was able to view exceptions in GAE easily. I was just prevented from making any real progress. If you want to see what I did get done, which was not much, check it out here: http://james-lorenzen.appspot.com

Tuesday, November 3, 2009

Learning with Grails: Security, Extjs, REST, Spring Insight

Unfortunately, security and performance are often postponed early in the development process, passed over in favor of new functionality. The usual justification is cost and time; I think the real issues are difficulty and lack of experience. I have to admit I am no security or performance expert, and implementing both early on would be time consuming and difficult. But I've always heard that security bolted on after the fact is never as good as security designed in from the beginning. So I wanted to share my experience using the spring-security (acegi) plugin in grails. Not only did I learn a lot about security, but also how to add support for REST services, integrate Ajax clients using Extjs, enable caching, and inspect the performance of my app using Spring Insight, which comes with the Springsource tc server developer edition.

Spring-Security (Acegi)


Building authentication and authorization into your app from the beginning can be difficult, but with the spring-security plugin and grails it's super easy. There are other security plugins for grails, but I decided on spring-security/acegi since our team has contemplated using it on our project several times and it has a pretty good history. This isn't a how-to guide on using the security plugin; the documentation is pretty good. However, I do want to share some unexpected things I ran into and how I arrived at some of my conclusions.

When applying the security capabilities, I used Test-Driven Development (TDD) to test my assumptions and changes. After I installed the plugin, I got started right away creating Users, Roles, and Requestmaps in Bootstrap. Requestmap is the domain model mapping roles to URIs. I preferred this method initially because I wanted to persist this information to the database. First I wanted to lock down the ability to delete a model I called Event. Here is a sample Bootstrap to accomplish it (a sketch; the class and property names depend on how the plugin's create-auth-domains script was run):
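class BootStrap {
    // injected by the acegi plugin; used to hash the password
    def authenticateService

    def init = { servletContext ->
        def adminRole = new Role(authority: 'ROLE_ADMIN', description: 'admin role').save()
        // only admins may reach the delete action
        new Requestmap(url: '/event/delete/**', configAttribute: 'ROLE_ADMIN').save()
        def admin = new User(username: 'admin', userRealName: 'Admin',
                passwd: authenticateService.encodePassword('secret'),
                enabled: true, email: 'admin@example.com', description: '').save()
        adminRole.addToPeople(admin)
    }
}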

In this first test I created an admin role, mapped the admin role to /event/delete, and then added a new user to the admin role. This worked as expected but had a pretty big security hole (see Unexpected observations using spring-security plugin). The issue was that unauthorized users were still able to delete events using the edit form. Underneath, grails submits to the index action and the controller directs the request to the delete action, bypassing the security created by my Requestmap. I could hide the Delete button on the edit page, but malicious users could still exploit this hole.

So instead of using Requestmap, I annotated my controller actions. This method can properly handle the use case above and deny unauthorized access to the delete action.
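For example, a minimal sketch using the acegi plugin's @Secured annotation:

import org.codehaus.groovy.grails.plugins.springsecurity.Secured

class EventController {
    @Secured(['ROLE_ADMIN'])
    def delete = {
        // only ROLE_ADMIN reaches this action, no matter how the request was routed
    }
}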
I also learned that users have to be associated with a Role, otherwise they are unable to log in (see post). This seemed rather annoying since I was expecting to use the predefined roles IS_AUTHENTICATED_FULLY, IS_AUTHENTICATED_REMEMBERED, and IS_AUTHENTICATED_ANONYMOUSLY without having to associate all my users with roles. Two suggestions from Burt were to extend a base model class that supplied a default Role for all Users, or to extend the GrailsDaoImpl class to remove the role requirement for users.

In summary, it seems best to annotate your controllers instead of using Requestmap, and keep in mind that by default all users need to be associated with a role to log in to your application.

REST Support for Extjs Ajax clients


Now that I had portions of my app locked down, I wanted to see how this affected Rich Internet Applications (RIAs) such as those that use Extjs. I wanted to answer 2 basic questions:
  1. How can javascript clients remove admin functions like Delete buttons?
  2. How could I modify my controller to support multiple clients that need HTML, JSON, or XML?
To test these questions, I downloaded and installed extjs into my grails app and created a simple grid based on the array-grid example in extjs. As part of this test I wanted to add a Delete button to the bottom toolbar, so that I could later enable or disable it based on the user's role.

I'm not super proud of how I contrived the roles and used them in javascript, but below is how I did it. (If you were doing this for real, I'd make it a point of focus to come up with a better solution, like creating a service that returns this information as JSON for Ajax clients to call.)

Since there is no real easy way to get access to the user's roles in a gsp, I set a bean in the controller that is used in the gsp. So in EventController I added the following grid action (a sketch assuming the acegi plugin's injected authenticateService):
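def authenticateService

def grid = {
    // userDomain() returns the logged-in user (or null);
    // the GSP below iterates over these authorities
    [authorities: authenticateService.userDomain()?.authorities]
}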
Then in a new grid.gsp I added the following:

Ext.onReady(function() {
    var roles = new Ext.util.MixedCollection();
    <% authorities?.each { %>
        roles.add("<%=it.authority%>");
    <% } %>
});
Then when I build the grid, I can enable or disable the Delete button by checking:

roles.contains("ROLE_ADMIN")
That's pretty much how I disabled certain gui items that are restricted based on role. Again not very pretty, but effective.

Next, I wanted to populate my grid with real data from my controller. Fortunately for developers, grails excels in this area through convention over configuration, using withFormat content negotiation with URI extensions. All I had to do was modify the controller's list action to support more than HTML responses:

def list = {
    params.max = Math.min(params.max ? params.max.toInteger() : 10, 100)
    def events = Event.list(params)
    withFormat {
        html { [eventInstanceList: events, eventInstanceTotal: Event.count()] }
        xml { render events as XML }
        json { render([list: events] as JSON) }
    }
}
Here I easily add support for XML and JSON clients while continuing to support clients who want HTML. One thing to note is that I like naming arrays in JSON; that is why you see the render([list: events]) syntax. Without that, the JSON response looks like this
[{"class":"Event".....
instead of my preferred way
{"list":[{"class":"Event".....
Now all the client needs to do is request the URI /event/list.json or /event/list.xml.

Here is the javascript with the example JsonStore and Grid definitions, as a rough sketch (the field names and URL are assumptions):
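Ext.onReady(function() {
    // field names and the relative URL are assumptions; the roles
    // MixedCollection comes from the grid.gsp snippet above
    var store = new Ext.data.JsonStore({
        url: 'list.json',     // hits the json block of the list action
        root: 'list',         // matches render([list: events] as JSON)
        fields: ['name', 'date']
    });
    var grid = new Ext.grid.GridPanel({
        store: store,
        columns: [
            {header: 'Name', dataIndex: 'name'},
            {header: 'Date', dataIndex: 'date'}
        ],
        width: 500,
        height: 300,
        renderTo: 'event-grid',
        bbar: [{
            text: 'Delete',
            disabled: !roles.contains('ROLE_ADMIN')
        }]
    });
    store.load();
});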

Spring Insight and Caching


Next, I really wanted to try out the Spring Insight capability that is now included in the Springsource tc server developer edition. It lets you see how grails and hibernate are operating on your behalf per request. For many web applications, there are typically 2 bottlenecks that degrade performance: 1) the number of trips from the client to the server and 2) the number of trips to the database. Using Spring Insight, developers can see what SQL is being executed and how long it took.

To get started with Spring Insight I followed their Getting Started Guide. Once I had Insight running (http://localhost:8080/insight), I ran 'grails war' to create a WAR that I could deploy to tomcat. (Note: if you do this multiple times and redeploy, be careful what you do in your app's Bootstrap class. I was doing a lot of inserts, and my WAR failed deployment because 'grails war' defaults to the production environment, which uses a file database with dbCreate set to update. My Bootstrap was trying to add roles and users that already existed, and that caused deployment to fail.)

The WAR deployed fine and I was up and running, watching my app's performance metrics in the Insight dashboard as I moved around in my app. The first thing I noticed was multiple unexpected JDBC calls, when I thought caching was enabled by default in grails.


What I learned is that you have to enable caching on a per-domain or per-query basis. So I added the following to my Event domain model:

static mapping = {
    cache true
}
I rebuilt and redeployed the war and was able to see fewer JDBC calls, proving that my cached domain was working. I then added a new Event, viewed all events, and saw the extra select call since the cache was purged. Pretty impressive!

Next I wondered if spring-security was caching all the user and role information by default as advertised. If not, that could be a huge performance issue. Spring Insight was able to prove that it was caching its results.

Overall, I learned a lot about grails, security, REST, extjs, caching, and spring insight, and grails was the perfect platform to prototype these concepts in preparation for real production use.

Tuesday, August 4, 2009

Blank screen when installing Ubuntu


Several co-workers and I run encrypted Ubuntu Desktop on our work laptops. I have an HP Compaq 6710b, and these laptops have all had monitor issues when installing Ubuntu. If you run the default "Install Ubuntu" option, the next screen is just blank. I need to get this documented because I always forget how to fix it.

While at the Ubuntu Splash screen do the following:

  • Press the F1 key for "Help"
  • Press the F6 key for "Special boot parameters for special machines"
There you will see an option in the middle described as:
Laptops with screen display problems vga=771. We need to append this to the Boot Options.
  • Press the ESC key to get back to the Splash page
  • Press F6 for "Other Options" and then Press ESC
Now the Boot Options should be visible. It should look something like this:

Boot Options file=/cdrom/preseed/ubuntu.seed initrd=/install/initrd.gz quiet --

Delete the trailing -- and add in vga=771. You should have the following:

Boot Options file=/cdrom/preseed/ubuntu.seed initrd=/install/initrd.gz quiet vga=771

Press the enter key and you should now be able to successfully install Ubuntu.

Thursday, July 23, 2009

Linux config files available on github

I pushed to github several of my common linux config files that I often find myself sharing with others via email. I'd say it's currently specific to Java developers using maven2. I lean heavily on aliases defined in my .bashrc file, which lives under my home directory. Hopefully others will fork this project and I can merge their changes back into mine.

Not only did I want an easy way to share with co-workers, I also wanted to be able to port these files to other machines, or restore them when I need to rebuild my work laptop. Also, I just want to get more comfortable with a DVCS workflow. Thanks to Ron Alleva's post (Using Git and Subversion in 5 Easy Steps) about using git as a client to svn, I am all set at work to start using git.

Please check it out at https://github.com/jlorenzen/linux-config-files/tree

git clone git://github.com/jlorenzen/linux-config-files.git

Currently it includes:

  1. .bashrc - Contains my aliases
  2. .synergy.config - Config file I stole from Joe Kueser which lets me control a windows machine with the keyboard and mouse connected to my linux laptop (see synergy).
  3. .vimrc - Simple vim settings. Nothing special yet
  4. idea.sh - Simple IntelliJ Idea startup script for linux
  5. idea.vmoptions - Modified vmoptions for IntelliJ Idea
  6. start-hudson.sh - Example of a generic linux startup script
  7. stop-hudson.sh - Example of a generic linux stop script
For those looking to create a github account (and for my future reference): I struggled with how to sync up my local master with the master on github. Thankfully my shell history was around to remind me.

git push origin master

This was after I did the one step (I think) to add the origin:

git remote add origin git@github.com:jlorenzen/linux-config-files.git

Wednesday, July 22, 2009

Minor issue with Ubuntu Desktop

Linux and Ubuntu are awesome. I switched from developing on Windows XP to Ubuntu Desktop in late 2007 (7.04, or Feisty Fawn). I will never again develop on a windows machine. Hopefully I won't ever work for a company that requires a windows platform; at that point I would rather buy my own work computer or set up Ubuntu in a virtual instance. I really appreciate Rob Madole and Brian O'Neill for showing me the right path. I remember Brian's frustration anytime he was forced to pair with me on my windows machine. Now I am like that with my co-workers.

Even though Ubuntu is great, I still have one minor complaint. Well, it's really 3 things combined: extended monitor support, Compiz, and Gnome-Do. I am addicted to all 3, but trying to get all 3 to work together seems impossible.

Extended Monitor Support
The big problem here is going back and forth between work and home. At work I want to use my external monitor, but at home I use just the laptop screen. In the past I ran nvidia-settings whenever I wanted to switch. This has improved, with some help from Ron Alleva, by using the xrandr command. He gave me some aliases to run when I wanted to switch. When I first installed 8.10, I used those aliases, but lately it seems to be auto-detected: when I plug my monitor in at work, ubuntu recognizes it and I don't have to run anything, nor do I have to run anything when I get home. So it would seem I have a good handle on this now, but when using an extended monitor other things don't work.
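The aliases looked something like this (the output names are assumptions; run xrandr -q to see yours):

# hypothetical .bashrc aliases for toggling the external monitor
alias extmon='xrandr --output VGA --auto --right-of LVDS'
alias laponly='xrandr --output VGA --off'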

Compiz
I love compiz mainly for one thing: transparency in my terminal. I'm sure I have set a million other settings before, but enabling Normal visual effects under System > Preferences > Appearance is usually one of the first things I do, and I believe this setting uses Compiz. With it enabled, my terminal window is transparent enough that I can see other applications in the background, like firefox or pidgin, which I typically keep behind the terminal, without having to switch back and forth. Unfortunately, it seems I can't have this setting enabled when I have an extended monitor.

Gnome-Do
Man I love it. I am hoping to eventually get to the point where I have an entirely clean desktop like this. My favorite is the docky option seen here, which is similar to Mac's Spotlight. Gnome-Do works just fine with an extended monitor, but docky requires compiz to be running in Normal mode, and if you recall, compiz doesn't work for me when using an extra monitor.

This is probably a pretty minor complaint versus all the ones I had against Windows, so I don't expect any new patches coming out of Canonical. In an ideal world, my extra monitor would just be automatically detected, compiz would work, and therefore gnome-do's docky would work.

Friday, June 19, 2009

Maven Global Excludes

To my knowledge, maven2 currently does not have the ability to globally exclude dependencies. Instead there is the tedious method of excluding a transitive dependency inline with each direct dependency (see Conflict Resolution). For complex multi-module projects this can be difficult to manage, and the ability to exclude a dependency globally would be very useful. It seems others share the same feelings (MNG-3196). Unfortunately for maven2 users, this is targeted for maven3. So until then, here is a tip on how to globally exclude dependencies in your project (provided by my co-worker Ron Alleva).

To globally exclude a dependency, all you need to do is set that dependency's scope to provided. This works for transitive dependencies, which is really what you want.

So for example, let's assume I have a WAR project that depends on commons-logging-1.1, which according to "mvn dependency:tree" has a transitive dependency on avalon-framework-4.1.3.

[INFO] +- commons-logging:commons-logging:jar:1.1:compile
[INFO] |  +- logkit:logkit:jar:1.0.1:compile
[INFO] |  \- avalon-framework:avalon-framework:jar:4.1.3:compile
Assuming I want to exclude avalon-framework from my WAR, I would add the following to my project's POM with a scope of provided. This works across all transitive dependencies and lets you specify the exclusion once.
<dependencies>
  <dependency>
      <groupId>avalon-framework</groupId>
      <artifactId>avalon-framework</artifactId>
      <version>4.1.3</version>
      <scope>provided</scope>
  </dependency>
</dependencies>
This even works when specifying it in the parent POM, which would prevent projects from having to declare this in all child POMs.

Monday, May 18, 2009

Ubuntu, Oracle XE, and SQLPLUS

In the past for local development I used MS SQL Server running in a VMware windows instance, but that got to be too burdensome and consumed too much of my 2GB of RAM (who would have thought, 10 years ago when I was playing StarCraft on a desktop with 32MB of RAM). Anyway, on a recent business trip, a co-worker (Matt White) who also runs ubuntu brought to my attention Oracle XE: how easy it is to install via apt-get and how small its footprint is, considering it's a database and it's Oracle.

I have been very impressed so far and would highly recommend it for linux users wanting a local database. Again, not only is it easy to install via apt-get once you add the repos, but I really don't notice it consuming too many resources.

The first of two hints I would like to self-document more than anything: after installation, the name of the SID is XE. You don't specify it during installation, but that is what it is. So my connection string in jboss looks like this:

jdbc:oracle:thin:@localhost:1521:XE

The second hint I wanted to self-document is how to get sqlplus working. Personally, I'm oftentimes too lazy to write straight SQL to manipulate data manually; it's not a good use of my time. But this weekend I got familiar with it again due to the lack of a good GUI tool like Oracle SQL Developer at one of our production sites. I was basically forced to use sqlplus to change a few values. The one huge benefit it has is that you don't have to wait on some slow GUI tool to load. So locally I now have Oracle SQL Developer for extended use, the SQL Query Plugin in Idea for when I'm writing code, and now sqlplus for when I am impatient and don't have Idea up.

It doesn't work right out of the box. You have to set ORACLE_HOME, add its bin directory to PATH, and also set ORACLE_SID.

I added the following to my home's .bashrc file:

export ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
export ORACLE_SID=XE
export PATH=$ORACLE_HOME/bin:$PATH

After that, reload the .bashrc file by running . .bashrc and then run sqlplus.
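For example (the SYSTEM password is whatever you chose during installation):

. .bashrc
sqlplus system/your_password

With ORACLE_SID set to XE, sqlplus connects to the local instance without needing a full connect string.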

Saturday, April 18, 2009

Mock Testing with Groovy

Mock classes enable developers to quickly write unit tests that would otherwise require integration tests needing a database, web container, or servlet container. Using mocks helps test a class in isolation and enables rapid feedback. It's not ideal for a project to have only integration tests and no unit tests; mock classes make unit testing possible where it otherwise wouldn't be.

So how does one create a mock class? Well, there is definitely no shortage of mock frameworks: EasyMock, jMock, Gmock, MockFor and StubFor. You can always create your own mock class in your test suite (which I have done in the past when in a pinch). But in my opinion these solutions lack one thing: the ability to quickly create a simple mock that returns what I want when called. Too many of the mock frameworks force you to jump through hoops and call methods like expect(), replay(), and verify(). What I want is the ability to define a mock class in a single line and inject it myself.

I thought MockFor and StubFor would be the solution, but the documentation is lacking and I haven't figured out how to make it work for me. Ideally I would like to say something like:

def mock = new MockFor(ICarDao.class) {
    getCar: {return new Car(color: "blue")}
}
Then MockFor would mock out the remaining methods of ICarDao, and I would have a mock class implementing the getCar method that, when called by the Class Under Test (CUT), returns a single Car model. But MockFor doesn't work like this, and neither does any other mock framework to my knowledge.

There is hope, however. Below you can read about 2 alternatives: groovy's metaClass and its as keyword. Both require the use of groovy in your tests. If you haven't switched to using groovy to write tests yet, even for Java code, it's time to start. No other framework or library can make you more productive when writing tests. It's an instant boost.

Groovy's metaClass
As seen in this example, groovy's meta programming is very powerful. In that post I show how to essentially mock out Thread.startDaemon() by using Thread.metaClass.static.startDaemon. Groovy's meta programming is heavily used in grails to make things simple, but it doesn't work in all cases.
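For instance, stubbing an instance method on a concrete class looks something like this (CarDao here is a hypothetical concrete implementation of the ICarDao interface used below):

// stub getCar() so it always returns a blue Car
CarDao.metaClass.getCar = { -> new Car(color: "blue") }

def dao = new CarDao()
assert dao.getCar().color == "blue"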

Groovy's as keyword
Using metaClass is by far the easiest and my favorite way to create a mock class. However, it didn't work in my recent attempt to write unit tests for a Java Manager class that relies on spring to inject a DAO. I believe it didn't work because my Manager class never creates the concrete DAO; it defines getters and setters and expects spring to inject the concrete class. Because of this, metaClass didn't work (bummer). So I did a lot of research and came up with a competitive alternative: groovy's as keyword.

Let's start by defining the Manager class:
public class CarManager {
    private ICarDao dao;

    public void startCar() {
        Car car = dao.getCar();
        .......
    }

    public CarManager setCarDao(ICarDao dao) {
        this.dao = dao;
        return this;
    }
}
Now to test this using mock classes and the as keyword all you need to do is this:
class CarManagerTest extends GroovyTestCase {
    def void test_start_car() {
        ICarDao mock = [
            getCar: {return new Car(color: "blue")}
        ] as ICarDao;

        def cut = new CarManager().setCarDao(mock);
    }
}

This uses a map and the as keyword to implement an interface. The key is the name of the method to mock, and the value is a closure returning what you want when it's called. There is no need to define all the methods of the interface, just the ones you want to mock.

To me, metaClass and the as keyword are much cleaner and simpler than the current mock frameworks, at least for this type of testing. Those frameworks might be perfectly useful for other types of testing; I just haven't run into them yet.

Monday, April 13, 2009

Better Offline Capabilities with Maven 2.0.10

This week while traveling on business, I had hoped to get a lot of work done, but I was quickly disappointed when I wasn't able to build because maven couldn't download the latest SNAPSHOTs, even though I had fresh local SNAPSHOT versions that would have sufficed.

Fortunately, maven 2.0.10 was recently released, and it promised fixes for this exact situation (see the release notes). I had been happily using version 2.0.9.

So now that I am at the hotel, I decided to see if an upgrade would help. I first built the submodule again to make sure I got the dreaded "unable to download dependency" error, then downloaded and installed 2.0.10 and rebuilt. I am very excited to report that it worked.

To stay tuned into everything Maven, subscribe to Brian's Enterprise Blog at Sonatype. Brian Fox is one of the lead developers of Maven, and according to Google Reader I read 100% of his posts.

Friday, March 13, 2009

grails create-app esb

I know it seems like a strange app to create with grails, especially when there are several other capable opensource ESBs available (mule, servicemix, openesb), but instead of asking yourself why, ask yourself why not. While it may not offer all the same features (BPEL), it's certainly a possibility in certain circumstances. The idea was spawned with co-worker Kit Plummer while discussing options for an upcoming story that required email integration. At first it seemed kind of ridiculous, especially since we were already using openesb in jboss, but the more we thought about it the more sense it made.

So how could an MVC web framework possibly replace a feature-complete ESB? Well, first let me explain my background. I don't consider myself an ESB expert, but I do have some experience with ServiceMix and OpenESB (see openesb topics). In fact, my former company let our team develop and open source 4 JBI Binding Components, for RSS, SIP, UDDI, and XMPP.

Here was a short list of complaints I had with running OpenESB v2 in jboss:

  1. At first it was pretty simple to set up, install, and run for a single developer, but try to duplicate that across a large distributed team and things get more complicated. This included the difficulty of setting it up in all of our CI and beta environments. It's not as easy as just running Glassfish, which includes OpenESB.
  2. OpenESB v2 basically required Netbeans, which again isn't too hard for one developer. But asking your team to run a second, unfamiliar IDE is no easy task. The OSGi-based OpenESB v3 does not require Netbeans, though Netbeans does make it easier.
  3. Composite Applications are less than easy to create, test, maintain, version, and deploy in CI. At least not compared to a Grails WAR. Being able to do those five things consistently over 12-24 months matters more than you think.
  4. Security. It's more difficult to lock down compared to a WAR running in jboss fronted by apache.
Here are some advantages we saw in treating Grails like an ESB:
  1. Easy. Simple. Trivial for everything including: developing, maintaining, testing, deploying, versioning, securing, and installing.
  2. Grails is plugin based and has a growing number of good plugins. One of the main benefits of an ESB is leveraging work others have done so you don't have to write it yourself: things like HTTP, JMS, SMTP, JDBC, RSS, XMPP, FTP, BPEL, XSLT, and FILE, just to name a few. Granted, many Grails plugins are web focused, but there are several similar capabilities such as HTTP, JMS, JDBC, SMTP, RSS, and Workflow. Beyond that, writing your own Grails plugin is easy compared to writing your own ESB component. See the Mail plugin as an example of how easy it is to send an email in Grails (a sketch follows this list).
  3. Doesn't require Netbeans. Developers can continue using their favorite IDE.
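To illustrate that point about the Mail plugin, here is a minimal sketch using its sendMail builder (the address and content are made up):

// inside a grails controller or service, with the mail plugin installed
sendMail {
    to 'orders@example.com'
    subject 'New order received'
    body 'An order just arrived via the REST API.'
}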
Despite all of that, I think the case for OpenESB is much better if your team is already using Glassfish+OpenESB (or GlassfishESB) and Netbeans, and much more difficult if you're not. And I know ServiceMix v3 was deployable as a WAR, though that was not its default behavior. I'm not sure about the OSGi-based v4, but I can't imagine they stopped supporting WAR deployment. Of the two, ServiceMix reminds me more of a Grails app as far as simplicity is concerned.

There is one big disadvantage to using Grails like an ESB: fewer incoming protocols. I could be wrong on this, but with grails you're probably limited to HTTP and maybe JMS (outside of setting up quartz jobs and polling). With an ESB it's practically unlimited (HTTP, JMS, JDBC, SMTP, SIP, XMPP).

I am sure I am missing several other key pieces, so I am interested in hearing from others. The nice thing is our implementation is hidden behind a REST API that could easily be supported by a bloated ESB later.

Thursday, February 26, 2009

Find when a branch was created in svn

If you merge between branches and HEAD in subversion, you most likely need to know the revision at which the branch was created. Assuming you don't have the luxury of the new merging features in subversion 1.5, here is a trick I learned from the svn docs.

svn log --stop-on-copy http://server/svn/myapp/branches/myapp-1.0

This stops once it hits the revision at which the branch was created, versus continuing on until r1. Previously I would log the entire branch and grep for the comment I inserted when I created the branch. Not ideal, but it worked. Now I use the --stop-on-copy option and quickly know the revision the branch was created at, which gives me the revision I need for the merge.

svn merge -r 546:767 http://server/svn/myapp/trunk

Can't wait until we upgrade to subversion 1.5 or a DVCS.

Tuesday, February 3, 2009

Security news for work and personal

Security Now is a popular podcast that discusses important issues related to personal computer security. It's led by TWiT (This Week in Tech) producer Leo Laporte and SpinRite creator and security expert Steve Gibson. It's not only for geeks like myself but very understandable by non-geeks; each week Steve does an excellent job of explaining complicated concepts in layman's terms. It's really good in that each topic applies not only to my work but also to home computer security. While on my trip to Korea, I was able to catch up on the past few episodes, and consequently I want to share with the Lorenzen Nation the things I have learned.

For example, one of the first things I learned when I started listening to SN (Security Now) was how WEP was broken. I know, I know, pretty pathetic, but I actually never knew this. My guess, or hope, is someone reading this didn't know it either. So if you are running a WEP wireless network at home or your business, and you want that network inaccessible to outsiders like your neighbors, consider switching to a WPA network. Apparently joining a WEP-enabled network is about as easy as joining an unsecured network.

Secondly, I learned a lot about changes I could make to my home network to make it safer and more secure. Specifically, this weekend I switched my network's DNS to OpenDNS, mainly for its parental controls. Before I made the change, I was easily able to visit an adult content site. Once I switched to OpenDNS and enabled the filtering, I was no longer able to visit these sites. Now I feel a lot more comfortable knowing that when my kids are on the computer, these sites aren't going to pop up.

Next, SN devoted several episodes to DropMyRights and Sandboxie. Both are used to help keep windows from getting infected by malware. I would recommend any Windows user at least use DropMyRights. It's free, easy to install, and easy to use. I started using it this weekend and it's worked perfectly so far. DropMyRights was created by a Microsoft employee who wanted to log in as an admin, as we all do, but still run certain applications with restricted rights. Why is that important? Well, among the many things malware does, all of which require admin rights, are:

  1. Creating files in the system32 directory.
  2. Terminating various processes.
  3. Disabling the Windows Firewall.
  4. Downloading and writing files to the system32 directory.
  5. Deleting registry values in HKLM.
All of this fails if the user is not an administrator. But developers hate running as a non-admin, so the solution is to install DropMyRights (or not run winblowz). Then to run Firefox you run something like: "C:\Program Files\DropMyRights\DropMyRights.exe" "C:\Program Files\Mozilla Firefox\firefox.exe".

Sandboxie is another neat security application that lets you run any application, or even a drive, in a separate sandbox. So when you run Firefox in sandboxie, if malware gets installed, it's installed in the sandbox and not your OS. It's very flexible; a caller even said they ran a thumbdrive in a sandbox. This one costs money, so I have yet to install it.

Finally, I learned that if you can afford over 200 PS3s and are incredibly smart about cryptography, you can generate md5 collisions and create your own valid, fraudulent certificate. Over the past few years researchers have gradually weakened md5, but now it is essentially broken to the point where no one should be using it in certificates. This one applies at work and at home. Not only should I not be creating certificates at work using md5, but at home I should not visit https sites whose certificates use md5. See the resource notes for episode #177 for further information (under Breaking SSL by Spoofing a Certificate Authority). I found several trusted certificate authorities defined on my home computer that use md5, and even a few expired certificates using md2; I suppose malware could change your system time if you weren't running DropMyRights. If you want to see this in action, set your system time to August 15th, 2004, then visit this site https://i.broke.the.internet.and.all.i.got.was.this.t-shirt.phreedom.org and check out the certificate. This will set up a secure connection using a fraudulent certificate.

In summary, I have learned a great deal about work and home computer security by listening to the Security Now podcast. Even if I didn't understand all of the details, it has definitely made me a more security-aware user. Do as I did: get rid of your WEP network, switch to OpenDNS, install DropMyRights on Windows, and subscribe to the Security Now podcast (very easy in iTunes).

Saturday, January 17, 2009

Testing REST Services with Groovy

For a while now, RIAs (Rich Internet Applications) have been rapidly replacing traditional server-side web applications (JSP, JSF, etc). Typically, these flashier sites are created using Flex or javascript libraries like extjs or yui. At the heart of these sexy applications live simple REST services that return JSON or XML.

Testing these REST services should be made a high priority for several reasons:

  1. It's the contract between the client and server. Breaking that contract should be avoided.
  2. Your application may not be the only client. Once external parties start consuming your REST services you'll want tests in place to ensure you don't break their clients.
  3. To validate the response is well-formed XML or properly constructed JSON.
  4. They are valuable black-box tests, exercising the entire server-side stack from the REST layer down through the DAO layer to the database.
So, what's the easiest way to test REST services? For a while now I have been combing the internet for the best tools to accomplish this, since I wasn't going to do it in pure Java, and I think I finally found the right combination: Groovy and HttpBuilder. Groovy because it's super easy to parse XML and JSON, and HttpBuilder because it's a great wrapper for the popular Apache Commons HTTP Client library.

Now let's say you have a REST service at the URL http://localhost:8080/myapp/api/books that returns this JSON:
{"books":[
{"name":"Being Mad","author":"Kit Plummer"},
{"name":"Clown Life","author":"Josh Hoover"},
{"name":"You Miss Me","author":"Ron Alleva"},
{"name":"Talk to me Goose","author":"Jeff Black"}
]}
This is how simple it is to write a test in Groovy using HttpBuilder:
import groovyx.net.http.HTTPBuilder
import groovyx.net.http.Method
import static groovyx.net.http.ContentType.JSON

class BooksTest extends GroovyTestCase {
    def void test_get_all_books() {
        def http = new HTTPBuilder("http://localhost:8080")

        http.request(Method.valueOf("GET"), JSON) {
            url.path = '/myapp/api/books'

            response.success = {resp, json ->
                json.books.each { book ->
                    assert book.name != ""
                }
            }
        }
    }
}

To me the major advantage of this approach is being able to traverse the JSON response like I would in javascript. If the response were XML, it would be just as easy.

The only two remaining items you would need are the gmaven plugin in your pom and httpbuilder as a dependency. Something like this (the versions here are assumptions; check the plugin and library docs for current ones):
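<build>
  <plugins>
    <plugin>
        <groupId>org.codehaus.gmaven</groupId>
        <artifactId>gmaven-plugin</artifactId>
        <version>1.0</version>
        <executions>
            <execution>
                <goals>
                    <goal>generateTestStubs</goal>
                    <goal>testCompile</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
  </plugins>
</build>

<dependencies>
  <dependency>
      <groupId>org.codehaus.groovy.modules.http-builder</groupId>
      <artifactId>http-builder</artifactId>
      <version>0.5.0</version>
      <scope>test</scope>
  </dependency>
</dependencies>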