AJAX, FLEX, GWT and Google GEARS, Ruby On Rails, MS Silverlight, JavaFX? - Decisions, Decisions.
Thu, 30 Aug 2007 20:12:31 +1000
By: gosh'at'DigitalFriend.org (Steve Goschnick)
I have this feeling that we're in for a big change in the way we write applications. It's like the feeling that was in the air in the early 1990s before the www burst on the scene and changed the way we all use computers. There is a certain buzz in the air, a certain disquiet amongst developers, web developers and information architects, and lots of statements of self-reassurance that the developer's existing skill-set will go on being useful into the foreseeable future. Suddenly we have this rush of technologies, all begging us to give the Internet a 'Rich Internet Application' interface. We have FLEX from Adobe, GWT and Gears from Google, Silverlight from Microsoft, JavaFX from SUN, and AJAX going in all sorts of directions - probably the catalyst for all the others. But AJAX is not the only technology that set this particular language fest in motion - the other interesting technology from the perspective of the web was Ruby on Rails.
There are a few questions we might ask ourselves: why weren't language vendors offering us a RIA interface some time ago, since it has long been possible? Why are they offering them only now? Within the browser, we've all put up with HTML 4's lack of interactivity in order for our web pages to be reached, read and interacted with by hundreds of millions of users/viewers. Has the time come for people to break that mould? Can we retain the mould for some applications but break it for others, while claiming consistency? Can we change the mould entirely? I.e. does a 3D environment such as Second Life herald a new Internet, such that the HTML interface to the Internet as we know it will be obsoleted real-soon-now? As FTP is to HTTP, will HTML/XHTML become that to a 3D Internet?
Confining my thoughts to Java and servlets on the Internet, I can consider how JSP (Java Server Pages) has evolved the HTML form building and reading process. Java has gone from servlets to JSP, to Struts and tag libraries in general, to JSF (Java Server Faces) - level upon level of language and tagging - but at the end of the day, I'm still largely processing a form with pick-lists (HTML form selects), radio buttons, check boxes and input fields... which I could do in HTML forms anyway, long ago. Ruby on Rails is an example of an easier, less cluttered approach.
Confining thoughts to the richer interface aspect: one could always step outside the browser but still use the Internet infrastructure (particularly HTTP and SMTP) - see the MelbourneIT Creator approach (circa 1997) as a practical and early example. Java was the first mass-adopted language designed for the Internet from the ground up - so you could develop applications which had full GUI interfaces using Swing, but which also used the Internet to access central resources and communicate with a multitude of users, and could make the results available in standard HTML 4 - i.e. Rich Interface authoring tools. If we want to go to Rich Interface applications that use the Internet, why stay inside the browser? It certainly makes sense to stay inside the browser in two circumstances: when we want to reach hundreds of millions of everyday users, and when we require the user to be sitting at the computer to do the interacting.
It's as if developers are suddenly being 'herded' in a particular direction - RIA - where they stay within a browser delivery system, but can go beyond the straitjacket of the HTML 4 interface. It's a model that suits web-server-based application programs, where the user's data, and the user's programs too, are on the server and controlled by a service provider.
But which way does a software or a web developer go?
AJAX initially seemed a bit too primitive to me to represent a paradigm shift, yet it is certainly responsible for this avalanche of follow-on tools and directions. Yes, the page-at-a-time user interaction pre-AJAX was atrociously slow and woefully inefficient, but I can see why the term 'raw AJAX' has come into existence.
FLASH is something I wasn't keen on in the past, largely because I subscribed to Tim Berners-Lee's "insist on HTML 4 if you want the Internet to remain an open-standards system", and also because it contributed significantly to making the www into the 'world-wide-wait' - though broadband is making this second objection less relevant. I didn't have much need of animation, which is a niche that FLASH has admirably filled. I've been tempted to learn it when my kids used it in their school multimedia assignments. Sure, the Flash plugin is on 90-something percent of desktops, but that makes it neither universal nor an open standard. But look at the people dabbling in FLEX: when someone of the calibre of Bruce Eckel is getting right into it, that certainly makes someone like me sit up, take notice and re-evaluate FLASH in the form of FLEX. I suspect the real attraction of FLEX is that you can develop desktop applications that use FLASH - so the same skills and tools become useful on the web and on the desktop/handheld alike. And as mentioned, there's a generation of school kids who have been cutting their programming teeth on FLASH (in the past it was Visual BASIC, and far fewer of them), and that's got to be a significant and influential cohort - the near-future multimedia artists and designers - as we close in on the second decade of the 21st century.
GWT and Google Gears look interesting, but I get the feeling I'd end up with all my applications running within their server farm somewhere down the track - whether or not that is a good or a bad thing, probably depends on the application and its contents. Again, I think I need to embark on a project to see for myself how useful GWT and Gears really are.
... There is certainly lots of movement-in-the-camp as the word gets around, but which way is the colt of old Regret going to bolt?
Comment 1: by Russell Robinson, www.RootSoftware.com
Tue, 25 Sep 2007 11:36 AM
I'm reminded of a phone call I got from my father in 1982. He was very worried as I started out in the computer industry because he had just read a newspaper article that said programmers would be obsolete in a few years.
What was set to change the way software was to be developed 25 years ago? Fourth Generation Languages. Yes, most of you probably haven't heard of them.
They hung around for a few years, some were successful, but one thing was absolutely clear: programming was pretty much the same, and required the same skill set. Yes, there were a few more skills to add and some things got easier, but nothing really changed.
Let me compare how I design systems and write code now with how I did so 25+ years ago:
- I need to know a few more things.
- I need to know a few less things.
- The languages are now mostly Object Oriented, which is a major benefit.
Apart from that, it's the same.
And yet we (the computer industry) still keep trotting out "new" languages (most of which still seem to be making mistakes that were made in languages from the 1960s).
Haven't these so-called language designers and the programmers who think these "new" things are amazing ever heard of some key principles:
- If the language doesn't force you to declare variables before using them, 99% of your bugs will be caused by uninitialized variables (and you'll still have all the other bugs you would have had without these additional ones). Lesson learned from Fortran, circa the 1960s.
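A minimal Python sketch (my own hypothetical example, not Russell's) of the class of bug he describes: without mandatory declarations, a one-character typo silently binds a brand-new variable instead of being caught at the typo site.

```python
def total_price(prices):
    """Sum a list of prices -- but a one-character typo defeats it."""
    total = 0
    for price in prices:
        totl = total + price  # typo: meant 'total'; Python silently creates 'totl'
    return total  # the accumulator was never updated

# The language raises no error; the function just quietly returns 0.
print(total_price([10, 20, 30]))  # -> 0, not the intended 60
```

A language that requires declarations would reject `totl` at compile time; here the bug survives until someone notices the wrong answer.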
- Languages that use spacing (indentation) to denote semantics cause the programs to be difficult to parse, difficult to add temporary debugging or code disabling, subject to subtle logic changes due to indentation changes, and dependent on certain editor features. Lesson learned from Cobol (and some others I've forgotten), circa the 1960s.
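To illustrate the indentation hazard with a hypothetical Python fragment of my own: shifting one line by a single indent level changes the program's meaning, and no parser complains.

```python
def count_negatives(numbers):
    count = 0
    for n in numbers:
        if n < 0:
            count += 1
    return count            # correct: returns after the loop finishes

def count_negatives_buggy(numbers):
    count = 0
    for n in numbers:
        if n < 0:
            count += 1
        return count        # one indent deeper: returns inside the loop

print(count_negatives([3, -1, -4, 2]))        # -> 2
print(count_negatives_buggy([3, -1, -4, 2]))  # -> 0 (returns on the first iteration)
```

Both versions are syntactically valid; only the whitespace differs, which is exactly the "subtle logic changes due to indentation changes" being warned about.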
Those are some things I've personally seen in the "latest u-beaut" new languages.
I guess it's the problem with letting children play with the big boys' toys.
Here's the future as I see it:
- There will be more to learn.
- There will be more to ignore and unlearn.
- It will continue to be hard to decide what to ignore.
- My general principle for this is to ignore anything until it has been around for at least 5 years, is continuing to be used, and its growth is continuing.
- Wise technologists will continue to eschew proprietary technologies. This is actually the biggest lesson from the last 15 years of the 20th century.
- Software design will still be software design.
- Programming will still be programming.
- Browsers will continue to be the way to internet-based applications unless every computer comes with a standard install of a language runtime environment that is truly Internet-based and all servers come with a standard install of what this language needs to run. Java should have been it. History tells us it clearly isn't. It started as proprietary (and still is), right? Maybe that's why.
- Quantum computing *may* change everything. But I've heard that before.
Comment 2: by Steve Goschnick, DigitalFriend.org
Tue, 25 Sep 2007 11:01 PM
- Apart from lessons, a positive that arose out of Cobol was the DBMS. Though no longer a significant force, the early Codasyl databases gave rise to SQL and the Relational DBMSs.
- So-called 4GLs are alive and well in the CASE tools for RDBMSs, e.g. Computer Associates' ERwin - there are several low-cost CASE tools around in any one year, but they usually get snapped up and/or withdrawn from the market, probably by the CAs and IBMs of the world. (Although CASE has moved into the UML market, and now we are being told Model Driven Architecture (MDA) will do what the 4GLs of yesteryear were going to do.) They are used to develop the backend server structures rather than the client-side interfaces. The dismissal of the relational DBMS by many OO proponents helped many of them not to foresee the place for the DBMS behind Web 2.0. But granted, regardless of CASE tools, a good database design, as with a good program design, still depends largely on the person wielding the tool (= 'software design will still be software design').
- 'Programming is programming' does not address the user-interface aspects of programming, which have changed radically over the past 25+ years. And it's the human interface that is the key feature of Flash, and why AJAX has come to the fore (because of the straitjacketed interface that the open standards in the browser - e.g. HTML and CSS, both non-proprietary - have lumped us with). JavaFX is all about the human interface, so maybe that rules it back in? In my view - well, my 'early view' of it - it's about making Java Swing and the Java 2D interfaces easier to write and maintain, and Swing has been around since 1998. If you want to remove the human interface from the 'programming', then Fortran is probably the greatest exponent of such an approach - command-line interface only. Maybe that's why it has survived (46+ years) to make it to Fortran 2003? There's a thought: Fortran 2003 should go wonderfully well with the browser as interface, since it's never had any other interface from which programmers could justify any disappointment ;) ... but that's the current niche of the Perls of the world.
Comment 3: by Russell Robinson www.RootSoftware.com
Wed, 26 Sep 2007 6:35 AM
I think we're both trying to cover too many subjects at the same time. So, I both agree and disagree with parts of what you replied, depending on what the subject actually is.
Comment 4: by David Bagnara
Wed, 26 Sep 2007 9:30 AM
In The Age recently there were a couple of articles about application delivery across the internet. Maybe this is where internet programming is headed: 'Delay in local launch of SAP's BusinessByDesign service'.
German software giant SAP last week unveiled its software-as-a-service deal for medium-sized businesses, a huge step into the mainstream for this trendy technology.
Software-as-a-service pioneer Salesforce.com has unveiled a new strategy to become a global platform for business services delivered over the internet.
Comment 5: by Steve Goschnick, DigitalFriend.org
Wed, 26 Sep 2007 5:35 PM
Yes, interesting articles - thanks for the references, I hadn't previously seen them. I'm a big believer in the potential of Web Services (calling upon them from the desktop, in a standalone or orchestrated manner), but I'm not as keen on SaaS - however, as you point out, none other than SAP themselves are going that way, fast. Many early standalone web services seem to have gone - e.g. I had been using a Currency Exchange Rate web service at xmethods.net since 2003, in student assignments etc. (and it's referenced in numerous textbooks that cover web services), but it seems to have completely vanished some weeks ago. Many other less general services went the same way, well before it. I think some of the SaaS companies may be gobbling up some of the more prominent web service directory companies.
Comment 6: by Neville Franks, www.getsoft.com
Wed, 26 Sep 2007 9:23 PM
A few thoughts.
Many programmers (people) get bored easily and look for new things to do, new ways to work, etc. So along comes killer language x and, if the phase of the moon is just right, lots of developers hop on board and the snowball starts its roll. They don't necessarily use language x because it is better, faster, more succinct or more productive, but because they can and it's the flavour of the month.
The same holds true for everything to do with software development, be it DBMS's, UML, Aspect Oriented development, operating systems, virtualization, you name it. Programmers love to tinker and try new stuff.
Do they consider what the end users want or need, the best UI design, the most flexible way to code business logic? Probably not. They'd rather spend their time inventing language X1. It is a safe assumption that most of us have written one or more languages in our time.
Then along come Web Apps, and here is a new playground to try. Are companies actually making money developing Web Apps? Maybe a handful. How many are developing Web Apps? Heaps. Why? Because they can. Do end-users want web apps? Yes, some do.
Now I'm not saying there isn't a need for Web Apps - there is no doubt they have an important place. But right now it's just like a land grab, with everyone trying to produce the next app that Google, Yahoo or MS will buy for a gazillion dollars.
The most important aspects of programming that clearly affect my productivity are libraries and tools, such as IDEs. OO has delivered real benefits as well, and the days of having to write every line of code ourselves are thankfully over. STL and Boost, for example, have made me more productive. And languages like Ruby have amazingly comprehensive libraries. Ruby is also a very expressive language, meaning I can do more with fewer lines of code, which has to be a good thing.
Oh and the next cool language area that everyone is talking about is functional programming. I used to think C++ was gibberish written by monkeys. FP takes this to a new level.
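For readers who haven't met functional programming: the style replaces loops and mutation with expressions built by composing functions. A toy sketch of my own in Python (standing in here for a proper FP language like Haskell) - the data and threshold are invented for illustration:

```python
from functools import reduce

orders = [12.5, 3.0, 49.99, 7.25]

# Imperative style: a loop mutating an accumulator variable.
total = 0.0
for amount in orders:
    if amount > 5:
        total += amount

# Functional style: filter the data, then fold it -- no mutation anywhere.
total_fp = reduce(lambda acc, amount: acc + amount,
                  filter(lambda amount: amount > 5, orders),
                  0.0)

assert total == total_fp  # both sum 12.5 + 49.99 + 7.25
```

Whether the functional version reads as elegance or as gibberish written by monkeys is, of course, exactly the taste question being debated above.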
The more things change the more they stay the same.
Comment 7: by Steve Goschnick, DigitalFriend.org
Wed, 03 Oct 2007 5:35 PM
An interesting addition to the mix re FP. They currently teach first-year students in computer science (CS) and software engineering, here at Uni of Melbourne, the functional programming (FP) language Haskell, along with the C language - Haskell is filling a spot that might have been filled by Prolog in years gone by. They've been doing C as the introductory language for at least 12+ years that I'm aware of. However, putting FP in such a spot in the curriculum is hardly a vote for commercial success and innovation in the language. It is a vote for longevity though.
BUT, having just written that, it is no longer valid here - i.e. the whole curriculum at UniMelb is currently being redeveloped (thousands of subjects, hundreds of courses) into the New Generation degrees (only 4 of them). After looking far and wide at other university courses worldwide, future trends, etc., a select group of computer science and information systems boffins here have settled on the following emphasis: the course, 'Informatics'; one of its two opening subjects, 'Informatics 1: Practical Computing' - "The subject covers fundamental programming constructs, algorithms and data structures; information visualisation; web-centric computing; and an overview of the field of computing. Workshops and team projects will give students practical experience in solving data-rich problems involving computers, people and the Web. The problems will be drawn from a diverse range of topics, e.g. climate change, finance, social networks, and language analysis."; the language chosen for labs: Python (disclaimer: I personally had no involvement in the whole process). This seems to be a genuine attempt to merge Computer Science and Information Systems (= Informatics). See the full list of Informatics subjects and their descriptions, here. Informatics 1 & 2 are first year, 3 & 4 are second year, 5 & 6 are third year. (Notice that the Bachelor of Computer Science has its last intake of students in 2008; same with the Bachelor of Information Systems.)
Comment 8: by Dave Gardner, www.BigSitesDoneRight.com
Tue, 02 Oct 2007 3:38 PM
Where is software development going? I see a couple of revolutions brewing and a continuing evolution in the quality of the systems developers use. Every new language, every new library, and every development environment is inspired partly as an attempt to overcome the limitations of preceding attempts. Both libraries and languages have evolved. The libraries I use today are an enormous improvement over the original C libraries, just as C# as a language has benefited from the C -> C++ -> Java -> C# evolution. Even Visual Studio has improved, although not nearly enough in my opinion. Apart from this relatively quiet ongoing evolution I would like to think there are some big changes coming as well:
Browser Replaces Operating System OR Operating System Replaces Browser
I hope one or both of these headlines is in my future. I think the driver will be a vision of lightweight full-function applications that work from anywhere and can connect to your GPS, camera, microphone, home automation system, video, scanner, RFID device, etc., and allow you to use them from anywhere. Just point your mobile phone at the RFID-labeled toddler and tune in on any web browser connected to the Internet to see him crap his nappy. You won’t really be able to do this until the browser gets out of the sandbox to which it’s been confined.
Workflow Rules, OK?
Workflow will reach us all. We will be unable to avoid being represented as just another series of events in a multitude of systems. All your interactions with business, banking, utilities and government will be implemented using formal BPM systems. If you think dealing with bureaucracy is bad now, just wait!
Despite potential problems this is also an opportunity for bureaucracies to standardize, clarify and simplify common procedures, and hopefully redirect resources to dealing with less common situations.
There will be fewer and fewer programmers in this space, and an increasing number of business analysts, specialists in directory services, mail systems, queuing, virtual computing, etc. Many programmers will do nothing more than write glue to stick big systems together on top of a pervasive BPM substrate.
Visual Studio is a Lie
‘Visual Studio’ is pretty poor really. But I think visual development environments will attempt another takeoff. There will be a move away from writing code directly and a swing back to some of the code generation environments that were popular some years ago. I think the major difference will be the development of underlying languages that model the domain well (a la SQL for the database domain). Instead of simply generating masses of code as a one way street (I’ve finished designing so now I hit the Generate Code button and it’s finished, ha ha!) newer development environments will use high level languages that explicitly model the problem domain so you can easily develop incrementally. Even Microsoft recognize the value in this approach and will use it to turn Visual Studio into a better development environment.
This will have a major impact on the way developers work because it will require them to trust high level, truly visual development environments – something that has been anathema for many programmers till now. A lot of older programmers (us?) will not make this transition.
Comment 9: by Russell Robinson, www.RootSoftware.com
Tue, 02 Oct 2007 3:55 PM
Notwithstanding my previous remarks, I agree that *some* languages are improvements.
Clearly, if I never had to write C again, and could only write C++ I would be quite happy with that.
If Java actually did what it promised *and* I could distribute products without downloading 3 terabytes of runtime environment onto the end-user's computer, I think I'd like Java even more than C++.
However, my comments regarding languages not learning from the past was related to some things I've seen recently: Ruby and Python.
PHP has the disease too, but you can switch on a flag that protects you from the disease.
C# should not be seriously considered. Proprietary languages that work on one pseudo-operating system have no place in the real world.
> Even Visual Studio has improved, although not nearly enough in my opinion.
How many years? And it's still crap.
> This will have a major impact on the way developers work because it will
> require them to trust high level, truly visual development environments –
> something that has been anathema for many programmers till now. A lot of
> older programmers (us?) will not make this transition.
No, but if it's pretty and the marketers tell enough lies about it, enough people will adopt it.
Personally, I find it a very simple choice: use things that work.