Aside from knowledge of the JVM, profilers are another critical tool to have in our toolbox. It pays to know how to use them and how to leverage their new innovations to gain insight into what your code is doing. Speaking of which, JProfiler 7 has been released with a number of rather interesting features.
It’s interesting that they’ve chosen to add higher level introspection features such as JDBC analysis. I’ve always felt that introspection tools shouldn’t just regurgitate verbatim whatever they are analysing, but should also draw conclusions from such lower level details and offer higher level insights.
A classic example of this is the Eclipse Memory Analyzer Tool. It doesn’t just give you a listing of all objects in the heap like most other analysers. It reviews the heap contents, draws conclusions and creates higher level reports such as leak suspect reports and dominator reports. These not only give you the required numbers but also graphical representations, so you can instantly tell what’s going wrong and then drill down into the details. I think this is the direction in which introspection tools in general should be heading.
On the train back to the real world this evening, for another year at work, I came across a fascinating article in the New York Times titled ‘Electronic Trading Creates A New Trading Landscape – The New Speed Of Money Reshaping Markets’. For the duration of that journey I was wholly engrossed in the article and the radiating thought processes on technology and finance that it triggered effortlessly and constantly. It was an inspiring read, and one that made me glad and relieved to work in this industry.
Predominantly it talked about how, over time, smaller exchanges (such as Direct Edge) had eroded the overwhelming dominance and market share of the historic NASDAQ–NYSE exchange duopoly, and how, in the process, New Jersey had been transformed into ‘the heart of Wall St’ through the placement of data centres that now host and operate some of the largest stock exchanges in the US. The article’s charming reference to a ‘Tron landscape’ was based on the likeness of the blue phosphorescent lighting used to illuminate the data centres to that in the film.
More interesting to me, however, was the story of how this progression had been driven at its core by the breakneck speed and sheer extent of technological automation, advancement and innovation, leaving traders, regulators and the market struggling to keep up in its trail. So where are we now? Exchanges are distributed, leaner and more competitive. Through colocation, software advancement, closer proximity to targets and new fibre optic pathways constantly being laid along critical geographic exchange data routes, trading is faster. Through high frequency trading, dark pools and strategic algorithms, trading is more intelligent, allowing arbitrage and price exploitation through micro trading under stealth.
What, however, have been the consequences of such advancements over time? The use of HFT to place a very large bulk order in small increments was found to be the root cause of last May’s market crash, when the HFT algorithm in question continued placing trades as part of a large order despite prices sinking part way through. As a result, the SEC and the exchanges introduced a halt to trading on an individual stock if its price falls more than ten percent within a five minute period. Dark pools have been in the spotlight for being opaque and exempt from public scrutiny. And there is talk of regulating not only data centres and colocation but perhaps technology’s greatest achievement of all – speed. The unattended and perhaps ill-considered advancement of technology for mostly selfish motives has shifted control, transparency and ethical consideration disproportionately away from human discretion and towards machine code. Can technology continue to dominate this industry’s progression at its core, to its advantage, or will it become the victim of its own success? I wonder where we go from here. What do you think?
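The single-stock halt rule described above can be sketched as a rolling-window check: keep the last five minutes of ticks and halt if the latest price sits more than ten percent below the window’s high. This is a simplified illustration only, assuming a per-stock stream of timestamped ticks; the class and method names are invented for this example and do not correspond to any exchange’s actual system.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch of a per-stock circuit breaker: halt trading if the
// price falls more than 10% within a rolling five-minute window.
public class CircuitBreaker {
    private static final long WINDOW_MILLIS = 5 * 60 * 1000; // five minutes
    private static final double MAX_DROP = 0.10;             // ten percent

    // A timestamped price tick (hypothetical input format for this sketch).
    private static final class Tick {
        final long time;
        final double price;
        Tick(long time, double price) { this.time = time; this.price = price; }
    }

    private final Deque<Tick> window = new ArrayDeque<>();

    /** Record a tick; returns true if trading should be halted. */
    public boolean onTick(long timeMillis, double price) {
        // Evict ticks older than the five-minute window.
        while (!window.isEmpty() && timeMillis - window.peekFirst().time > WINDOW_MILLIS) {
            window.pollFirst();
        }
        window.addLast(new Tick(timeMillis, price));

        // Find the highest price seen inside the window.
        double high = Double.NEGATIVE_INFINITY;
        for (Tick t : window) {
            high = Math.max(high, t.price);
        }

        // Halt if the latest price is more than 10% below the window high.
        return price < high * (1.0 - MAX_DROP);
    }

    public static void main(String[] args) {
        CircuitBreaker cb = new CircuitBreaker();
        System.out.println(cb.onTick(0, 100.0));      // no drop yet -> false
        System.out.println(cb.onTick(60_000, 95.0));  // 5% drop -> false
        System.out.println(cb.onTick(120_000, 89.0)); // 11% below the high -> true
    }
}
```

A real implementation would of course need a market-wide reference price, per-symbol state and careful handling of reopenings, but the essence of the rule is just this threshold test over a time window.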
Hot on the heels of the last post about BumpTop comes a post about something very similar in terms of modelling our real life environment. Remember the electronic newspaper in Back To The Future that refreshed automatically with the latest news? Well, “Plastic Logic” have developed an electronic paper technology on plastic in collaboration with the University of Cambridge.
BBC News reports on Plastic Logic
CEO of Plastic Logic speaks about the technology
The potential applications of this technology are innumerable: practically anything that is static today could be made dynamic, with applications specific to its function or role. Let’s see where they get to in a few years. The source of this news is a friend of mine whom I can’t link to because he has no web presence.