Archive for the ‘News’ Category

Richard Feynman on Video

Sometimes you have to love Bill Gates: with Project Tuva, the old recordings of Richard Feynman, the Messenger Series from 1964, are now presented online, extended with a lot of additional extras linked to the content of the lectures themselves. Richard Feynman was one of the best-known scientists of the 20th century, famous for his very well written books and his humour. And yes, he was also one of the greatest physicists of the century.
If there ever was a reason to install Silverlight, this is it (if only it were also available for Linux).

Using Twitter as a micro blog

Because I’m sometimes a bit lazy, I decided to use Twitter as my micro-blog engine. My current tweets are visible on the left side. In the past I never thought Twitter would be of any use to me, but it is a handy way to post interesting snippets of information quickly without having to write a lengthy blog entry (as I said, I’m lazy).

Microsoft CHESS for managed code available!

Presented at PDC 2008, Microsoft has finally released the first version of its concurrency unit-testing tool CHESS. It was already available for Win32 applications and is now also available for managed code, integrated into Visual Studio 2008. Because CHESS controls all threads and their schedules while executing your code, it can find “Heisenbugs”; that means it is possible for the first time to build unit tests that test concurrency reliably. This is a unique capability which I’ve not seen in any other test framework, so it will be interesting to run it against some code of my own as well as against third-party libraries…
The only downside so far is that the required Visual Studio edition is Team System, because of the Microsoft unit-test framework, which is not available to everyone. But Microsoft provides a trial version, valid until December 2009, as a Virtual PC image, so there is a way to use CHESS.
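CHESS itself drives the Windows scheduler, so its API is not shown here, but the kind of bug it hunts is easy to sketch in Java (all names below are illustrative): two threads doing unsynchronized read-modify-write increments can lose updates, yet an ordinary unit test may pass for thousands of runs because the bad interleaving is rare — exactly the schedule-dependent behavior CHESS explores systematically.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class HeisenbugDemo {
    static int racyCounter = 0;                      // unsynchronized shared state
    static final AtomicInteger safeCounter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                racyCounter++;                       // read-modify-write: not atomic, can lose updates
                safeCounter.incrementAndGet();       // atomic: never loses an update
            }
        };
        Thread a = new Thread(work), b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
        // safeCounter is always 200000; racyCounter is usually less, but not always --
        // which is why a naive unit test for the racy version is unreliable.
        System.out.println("racy=" + racyCounter + " safe=" + safeCounter.get());
    }
}
```

The point of CHESS is that instead of hoping the unlucky interleaving shows up, the tool enumerates thread schedules deterministically, so a lost update like the one above is found and reproduced instead of flickering in and out of existence.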

QCon 2008 San Francisco Slides

The slides of this year’s QCon 2008 in San Francisco are publicly available here.

Microsoft BizSpark

Microsoft has launched its BizSpark program, which allows startup companies to use most available software free for three years; if they exit the program early they pay $100, otherwise they buy normal licences after the three years. With the coming launch of the Azure cloud platform in 2009 and the hype after Obama’s election, very good timing … although some will see the snake with the nice, big, juicy apple 😉

This year’s IRFS 2008

My former company is again arranging a symposium this year, for the first time with a, hopefully working, livestream. So if you are interested in information retrieval or patent knowledge, take a look.

The presentations and videos of the conference can be found here.

Why Microsoft may be in the lead again

At this year’s PDC, Microsoft showed several interesting products and projects which may put them a step or two ahead of the Java application stack:

  • The Microsoft Azure Platform: Microsoft finally joins the cloud hype, after Amazon and Google. Azure is highly integrated with Windows, naturally, but I think for the first time in history the outside world is supported from the start: OpenID will be used as the authentication service, and nearly all services are accessible via REST or native clients in Java and Ruby. Azure by itself could provide what BPEL and W3C web services promised but never delivered: an easy-to-implement way for different process participants to collaborate without investing in new middleware systems.

  • The evolution of C# continues: Not only will C# 4.0 repair the generic type system (co- and contravariance are finally supported), it will also introduce a very neat solution for handling dynamic data types. This is a big advantage over Java, where generics will not be corrected in the near future, at least not with Java 7; and although Groovy and JRuby are extremely good replacements when it comes to dynamic languages, it hurts that the original language does not evolve (but Scala and Clojure are showing what could be possible). Backwards compatibility is already broken, so why not go after C#?

  • Concurrency in implementation and test: Besides the already existing Parallel Extensions for .NET, which are also part of the Visual Studio 2010 CTP, Microsoft goes a step further with the Concurrency and Coordination Runtime (CCR) and the Decentralized Software Services (DSS) Toolkit. In my opinion this is the most interesting and workable solution for computational grids I have seen so far, although the Java frameworks GridGain and Terracotta are also very nice. Microsoft has found an API which successfully abstracts away the nasty synchronization locks and gives a very RESTful way to monitor ongoing activities. The second interesting project is CHESS, which for the first time makes it possible to really test concurrency in implementations. Normally, finding Heisenbugs is very tedious and you always need a bit of luck. With CHESS, they can be found much faster and traced back to their cause.

  • Oslo as Meta-DSL: Maybe this will not be the final solution, but Oslo is a nice-looking meta-DSL which allows one to textually define grammars for DSLs or schemas. Textual descriptions are usually much more maintainable than graphical ones, and they are definitely easier to process than XMI or MOF models.
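The variance fix mentioned for C# 4.0 is declaration-site variance (the in/out modifiers on generic interfaces). Java has offered the use-site counterpart via bounded wildcards since Java 5; a minimal sketch of the same two ideas (method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class VarianceDemo {
    // Covariance: "? extends Number" accepts a List<Integer> as a read-only
    // producer of Numbers -- the caller only gets values out.
    static double sum(List<? extends Number> producer) {
        double total = 0;
        for (Number n : producer) total += n.doubleValue();
        return total;
    }

    // Contravariance: "? super Integer" accepts a List<Integer>, List<Number>
    // or List<Object> as a write-only consumer -- the caller only puts values in.
    static void fill(List<? super Integer> consumer) {
        for (int i = 1; i <= 3; i++) consumer.add(i);
    }

    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<>();
        fill(ints);                       // List<Integer> is a valid Integer sink
        System.out.println(sum(ints));    // ...and a valid Number source: prints 6.0
    }
}
```

The C# 4.0 approach declares the variance once on the interface (e.g. IEnumerable&lt;out T&gt;) rather than at every use site, which is what makes it feel like a repair of the type system rather than a workaround.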

Randy Pausch died today

Today Randy Pausch died of cancer. He became famous for creating the Alice software project and, lately, for the book “The Last Lecture”, which he wrote fully aware of his cancer. In his book he tells what was really important in his life and how he handled the diagnosis. It is worth reading!

IRF Symposium 2007

I’m currently working for a company, a startup in the information retrieval domain, which a year ago founded an open community platform called the Information Retrieval Facility. An international symposium was organized last week; these were my impressions:
My impression from overhearing some discussions is that, from my point of view, there are three overlapping problem fields. First, there are the pure engineering problems, which are sometimes caused by not being aware of current technology and possibilities; in fact most of the solutions are 10 or more years behind the current state of the art. With professional software engineering, EVERY company is able to solve them; the only thing that matters is whether the technical solution itself supports the actual revenue-generating product.
The second problem field is the “big ball of mud” of the patent information itself. The raw information is very inconsistent, contains a lot of errors, comes in nearly every possible flavor and format, is not standardized in any way (and will not be in the near future), and is nowadays more a legal document than a technical description. There is no other way than to handle each small bucket of mud carefully and with as much domain knowledge as possible. Because patents are human-oriented, only actual domain experts will be capable of working through them. So the demand here is to find a way to provide a supporting workflow as well as “conversational interfaces” (language-oriented, not solely voice-oriented) to help mine information out of the pile of mud. Simple things such as flexible filter combinations can help a lot, as can building up a working feedback loop (OK, not so simple). As for draining the mud entirely, only Google would be able to do that…
The third field is the area of scientific problems in information and knowledge retrieval. Whether on the structural, semantic, language or domain-specific level, there are a lot of not-so-simple problems to be solved, and mostly no existing shortcut. The research here goes in two directions: make the information and the actual (meta-)knowledge more computable, and provide humans with tools that help them cope with the knowledge itself, through navigation, presentation, abstraction or decision making.
Some other common themes: it is clear that China will be the biggest challenge in the patent world; patent search is a challenging and very important factor, at least for international companies, and will become more important for smaller ones as well; and searching in special features such as chemical structures or images, along with feature and knowledge extraction from pure text, is not yet well integrated into a flexible tool to support search strategies.

So I have to say the event was very well organized, and the idea of building a community platform for different professions seems to work out nicely. Presentations and pictures can be found here.

UPDATE: The videos of all presentations can now also be found here.

DARPA Urban Challenge 2007

Yesterday I watched the complete DARPA Urban Challenge, the whole 7 hours, and it was worth the time (at least more interesting than some F1 races). Eleven teams qualified for the final race; six of them completed all three missions, all in good time. The goal for every bot was to finish its missions as fast as possible, drive without violating the California traffic rules, and not collide with any other bot or with any of the other 37 vehicles on the road. The course was very large, with trees and off-road sections, so GPS caused some problems, as did some unmapped streets. Every car got its randomly generated missions 5 minutes before starting, so no team knew in advance what to expect.
The first three cars had one thing in common: they drove really confidently, especially the Stanford car “Junior”, which was not only the fastest, it also drove absolutely perfectly. The second was, as in 2005, “The Boss” from Carnegie Mellon. The last three were also very interesting to watch: “Skynet” and “Little Ben” were very careful and “polite” drivers; they waited at every crossing until every other car had passed, and Skynet was by far the best-looking car. MIT’s “Talos” was fascinating to watch. They had more sensors and processing power attached to the car than anyone else, trying to process as much environmental data as possible, which resulted in very “spastic” driving, because every 5 minutes the car was clearly not sure what to do. Talos was also the bully of this race: it collided with Team CarOLO and with Skynet, and it ignored most of the traffic rules (in fact it drove like a teenager). But since this was MIT’s first participation in the race, it was a very good show.
The winner will be announced today, but I think “Junior” has made it.

So after the 2004 Grand Challenge, where only one car drove 8 miles, now six cars have succeeded in an urban environment. In three years the technology evolved very fast, and if you consider that at least Lexus is now starting to build robotic behavior into its cars (self-parking functionality), it is reasonable to expect that in ten years it will be technically feasible to build very reliable robotic vehicles. Maybe this was the last challenge, but I think DARPA has shown what is technologically possible in a very short time.

UPDATE: OK, I was wrong. Dr. William “Red” Whittaker made it this year with “The Boss”: his car won the DARPA Urban Challenge 2007, and Stanford was second this time. He was actually faster, with an equally perfect driving performance.