Tuesday, January 30, 2007
So I have some catching up to do.
I have a number of things on my mind. I think I'll start with a bit of a report-back on the goings-on in the project over the last few months.
When we started on the project, a WebSphere/RAD-based enterprise development with web services and session beans, we eschewed the default RAD/WAS approach of hand-coding the WSDLs. They were deemed too complex, and to be honest, there is some truth in that. However, as an esteemed fellow developer pointed out (web services duplication language), there is a lot of duplication in a WSDL, so once the basic WSDL has been coded it is merely a case of copy and paste. Furthermore, if you follow the best-practice approach of having one request object which encapsulates all the parameters, and keep the XSDs outside of the WSDL file, then once the methods have been set up (in the WSDLs), changing the parameters of a method does not require editing the WSDL at all. In fact it is very seldom that I find myself editing the WSDL file.
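To make that shape concrete, here is a minimal, purely illustrative WSDL fragment (all names, namespaces and file names are invented) of the style described above: one wrapper request element per operation, with the actual types imported from an external XSD so that changing a method's parameters only ever touches the schema, never this file.

```xml
<wsdl:definitions name="CustomerService"
    targetNamespace="http://example.com/customer"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:types="http://example.com/customer/types">

  <!-- The parameter types live in an external XSD, so a change to a
       method's parameters means editing CustomerTypes.xsd, not this file. -->
  <wsdl:types>
    <xsd:schema>
      <xsd:import namespace="http://example.com/customer/types"
                  schemaLocation="CustomerTypes.xsd"/>
    </xsd:schema>
  </wsdl:types>

  <!-- One wrapper request/response element per operation; the wrappers
       themselves are also defined in the imported schema. -->
  <wsdl:message name="getCustomerRequest">
    <wsdl:part name="parameters" element="types:getCustomerRequest"/>
  </wsdl:message>
  <wsdl:message name="getCustomerResponse">
    <wsdl:part name="parameters" element="types:getCustomerResponse"/>
  </wsdl:message>

  <wsdl:portType name="CustomerPort">
    <wsdl:operation name="getCustomer">
      <wsdl:input message="types:getCustomerRequest"/>
      <wsdl:output message="types:getCustomerResponse"/>
    </wsdl:operation>
  </wsdl:portType>
</wsdl:definitions>
```

Once this skeleton exists, adding a new operation is largely copy-and-paste of the message/operation blocks, which is the "duplication language" point above.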
Thus the first round was via a simplified WSDL where the XML containing the parameters was passed as a string. This string contained the method to be called as well as the method's parameters. This approach was quite convenient because it meant only one EJB session bean and one web service (WSDL file), so adding functionality at this level did not necessitate a new EJB/WSDL. Quite nice, but it is not particularly explicit (the outside world does not know what is available), and furthermore it was a non-standard approach.
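For illustration, a minimal sketch of that single-entry-point style (class, method and element names are all invented, and in the real system this would sit behind the EJB session bean): one `execute` method receives an XML string naming the operation and its parameters, and dispatches internally.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Hypothetical sketch of the "one web service, XML string in, XML string out"
// approach described above. Adding a new operation means adding a branch here,
// not a new EJB or WSDL -- but the interface tells the caller nothing.
public class GenericServiceDispatcher {

    public String execute(String requestXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(requestXml.getBytes("UTF-8")));
        Element root = doc.getDocumentElement();
        String method = root.getAttribute("method");

        // Dispatch on the method name embedded in the payload.
        if ("getCustomer".equals(method)) {
            String id = root.getElementsByTagName("id").item(0).getTextContent();
            return "<response><customer id=\"" + id + "\"/></response>";
        }
        return "<response><error>unknown method: " + method + "</error></response>";
    }

    public static void main(String[] args) throws Exception {
        String request = "<request method=\"getCustomer\"><id>42</id></request>";
        System.out.println(new GenericServiceDispatcher().execute(request));
    }
}
```

The convenience is obvious; the cost is exactly the lack of explicitness noted above, since none of this is visible in the WSDL.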
Thus effort was made to break the back of the IBM WebSphere WSDL generation and web service “topology”. It wasn't as difficult as initially thought. There were a number of teething problems brought on by the fact that the calling context was .NET and thus had different requirements. Time was spent refining the process, and today we have automated object generation from the WSDLs. What is ironic, and should not be surprising, is that the whole WSDL-to-Java generation sub-system has nothing to do with RAD; it only has dependencies on WAS. The generation and WSDL tooling of RAD is, for want of a better word, crap.
This approach worked well for us. We have ant scripts to generate the DTOs from the WSDLs, and it is nicely integrated into eclipse (notice the emphasis, even though we're actually using RAD).
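The ant integration might look something like the following sketch. This is illustrative only: the target name, properties and the exact generation command (and its flags) are invented, since the real invocation depends on the WAS version's WSDL-to-Java tooling.

```xml
<!-- Hypothetical ant target: regenerate the DTOs from the WSDL with one
     command, callable from inside eclipse/RAD. Paths and flags are
     placeholders for whatever the WAS tooling actually requires. -->
<target name="generate-dtos">
  <exec executable="${was.home}/bin/WSDL2Java.sh" failonerror="true">
    <arg value="-output"/>
    <arg value="${src.generated}"/>
    <arg value="${wsdl.dir}/CustomerService.wsdl"/>
  </exec>
</target>
```

The point is less the specific tool than the fact that the generation step is scripted and repeatable, with no dependency on RAD's own WSDL tooling.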
All very well, but the whole time RAD was holding us back in terms of productivity. The pressure from the business in terms of time frames increases all the time, and we found we were just losing too much time to the idiosyncrasies of RAD.
To give you an example, my RAD platform can no longer refactor! You might ask: what? Yes, when I try to refactor it throws an exception, some yarn about JSF! Wt_? You might ask: what have you done? And I would answer by challenging you to tell me exactly what I could possibly have done. Put it this way: I did not delve into the class files of RAD and change them specifically to make it stop refactoring (there are about 6 gigs of RAD on my hard drive), so what else could I have done to break it? The second question I would ask is how software can possibly let you disable some of its functionality by accident. It's like driving a car: when it suddenly breaks down, nobody asks what you did to cause the breakdown. It is the wrong question. The very fact that I can cause my IDE to break down in some way is ridiculous.
So this week we embarked on a mission to use JBoss in development. We will still be deploying onto WAS (our build server), which the front-end people will use to test against, but DEV will be on eclipse/JBoss. This transition has gone better than I think you'd expect. JBoss is a quality product.
It is hundreds of times faster, more efficient, more lightweight; not that it can't mix it with the “big boys”. It is not that JBoss is the monkey and WAS the elephant: JBoss is as nimble as the monkey and as strong as the elephant.
There have been minor differences between WAS and JBoss; this is to be expected. We had a minor issue today with the JAX-RPC mapping differences. I do not foresee a show-stopper in this endeavour.
That is an update from my side in terms of what's been going on. See my other post for some programming-specific ideas.
When I arrived on my current project, one of the first statements made was that we need a wiki. I hadn't used one in the past, so I was a little intrigued by the concept and what it would mean. I felt wikis were the flavour of the week, but since it seemed they would add value, a wiki was set up.
It quickly became apparent that a wiki is an essential part of the efficient operation of any software dev team, and I think for a number of reasons.
The wiki plays an important role in the collection, storage, categorisation and dissemination of information (mostly technical) related to the project. For any software team to run efficiently information is required that can be shared by the team, and a wiki is probably the most efficient mechanism to do this.
I'm not sure exactly how the word wiki is defined or what makes a wiki a wiki, and not being online right now, I cannot look it up. So I'll give my take on the essential elements of a wiki.
A wiki is an application which serves information efficiently, allows information to be related and structured, and allows for easy editing of the information in the same context as it is viewed. Typically wikis run in a web architecture, though this is not fundamentally necessary.
The question then is how the wiki has added value to the team, and how you optimise that value.
I think the simple way to show the value of the wiki to a dev team is to look at what would happen if the wiki was removed.
When I compare a previous large project that I was on, which did not have a wiki, with the current project that does, the difference is clear: the previous project does not have adequate documentation. The documentation it does have is either out of date, fragmented, duplicated in a number of places or irrelevant. A wiki would have removed all those weaknesses. It would have...
Provided a centralised location for all documentation
Provided a tool to categorise and relate various pieces of information together
Allowed for the quick and easy updating of information, keeping it relevant.
Furthermore, in the past, people would often ask “Where should I put that information?” - the default answer now is “on the wiki”. It is surprising how often this word comes up in everyday conversation.
Once you have been using a wiki for a few months it becomes indispensable.
However, there are a number of challenges...
Keeping things relevant – at the beginning of the project a lot of information is placed on the wiki. The rate of discovery of new information is high, so the wiki grows quickly. That information has relevance for its time, but because the rate of change in modern IT projects is quite high, it can quickly become dated. And because dated information is no longer relevant, the motivation to maintain it is low. Take our project, for example: there is now a lot of information on the wiki which is irrelevant!
Wiki structure – this is arguably the biggest challenge and I think potentially the most difficult, especially when things change – but getting it right can add a lot of value. The reason I say this is that if the information is appropriately structured, then it can be found more easily, and if it can be found more easily, then people are more willing to use the wiki and thus information is better disseminated.
But I think one of the key issues to remember is that there is no silver bullet to improving quality. A wiki is one of the tools at the developer's disposal. It can't make people record pertinent information, nor can it make people read it. One of the problems with information is that in order to be useful it has to be read. It is no use if the only person who benefits from the wiki is the person who originally wrote the information. Having said that, at least the information is now in two places (the developer's head and on the wiki), so the risk attached to that information is reduced.
All the wiki and other development tools can do is make things easier for the developer. They can improve quality if used appropriately, and the wiki is an important, I would even say essential, tool for improving quality and efficiency in a dev team.
What is interesting is that the large corporates have largely missed the boat as far as wiki software is concerned. Although the quality and maturity of wikis is high, I do think there is still some space for improvement; it's going to be interesting to see what kind of innovations materialise.
Trumpi raised the issue on his blog: How does one make wikis work?