From a disciple of evolution

Linden Lab built a famous virtual 3D world called ‘Second Life’ (let’s call it SL for short). It was the talk of the town, as I remember, for a good two years around 2008-2010. Although the idea is very interesting, it has since been overshadowed by ‘classic’ social networks such as Facebook. What could be a possible way to revive SL and bring it into the mainstream again? I guess it needs a change in its business model. So what is its current business model? In SL, anyone can log in and roam around; it’s a free world. However, one can buy ‘land’ in this virtual world and set up interesting ventures such as buildings, artefacts, exhibitions and so on. And just like on a website, people purchase goods. If all of this is promising, what part of the business model is pulling SL back? My guess is that it is an empty market, probably because of an incomplete market ecosystem. So if that ecosystem can be built, SL should thrive again. Is there an example where a market ecosystem has been built but lacks the capabilities of SL? There are at least two: Amazon and eBay. It could be either, but Amazon looks slightly better.

Why Amazon? Amazon covers a broad spectrum of the market ecosystem, from product catalogues to selling goods to servicing to analytics. Moreover, it is a well-integrated ecosystem from the point of view of both consumers and producers. With SL at its side, Amazon should be able to provide personalized services such as help-desk integration, virtual salesmen and so on. Virtual reality integration should close the gap vis-à-vis the real shopping experience. The only missing piece, to my knowledge, is auctioning, and that is where eBay has a slight edge.

Why eBay? It is literally a virtual marketplace. Individual buyers and sellers make deals and eBay facilitates them. Deals can be direct or through auctions, and eBay provides transparent information about sellers, such as feedback ratings. If I were to imagine a virtual marketplace in SL where consumers walk up to a live auction in 3D, it sounds exciting, doesn’t it?

Finally, what is your take? Please leave it in the comments below.


What is common among the ‘Intel 8086’, ‘Microsoft Windows 95’, ‘Google Search’ and the ‘Apple iPhone’? They are innovations that changed the World forever. When such a technology is launched, the World sees a ‘Tipping Point‘, or ‘tip’. Such innovations are celebrated, but they hardly ever happen in isolation. Each one requires sufficient progress in related fields. For example, the Apple iPhone could take advantage of progress in energy-efficient microprocessors. I believe such an innovation has just happened in the gaming industry too. Today SimCity World has been released by Electronic Arts. There have been many versions and variants of the classic SimCity, and most of them were popular in the gamer community. However, this time it is different. No, I am not talking about popularity, but about usage. Let me try to explain.

Intrinsically, games have been for entertainment. A SimCity player enjoys the construction and administration of a virtual city. The gamer plays for hours, building a city from the ground up and then expanding and governing it. Patience and passion are not only virtues but also a sort of ‘requirement’ that is not mentioned on the shiny box the game is packaged in. However, one can only do so much alone. For example, all the resources needed by a city must be available within the city’s limits and, most importantly, there is no competition. When players can connect their cities with one another, they cooperate, compete and evolve. It is analogous to connecting one’s PC to the Internet: the more people join, the better it is for everyone. Now, how can this change the usage?

When cities connect, far more dynamics is poured into the game, and it can begin to rival the dynamics of the World. This is an awesome thing for understanding global dynamics. Let’s take an example. Say there are 10,000 cities in the World, each with its unique geography, environment, resources, culture, people and position in time. That is enormously complex to imagine and deal with. So how should a city respond to a change, such as a drought or a new high-speed train line? How would the resources be utilized and the people given jobs? More importantly, there will be ripple effects because, inevitably, the cities cooperate and compete. These avenues can be explored and discovered to understand which proximate causes lead to which aggregate effects in this non-linear World. All this experimentation is impossible unless there is inherent potential to operate at a truly global scale. This is the new usage, and it should differentiate SimCity World from many other games.
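To make the ripple-effect idea concrete, here is a minimal toy sketch. All city names, numbers and trading rules below are my own assumptions, not anything from the game: a handful of cities share a single resource over a ring network, and a local shock in one city spreads to the others over a few rounds.

```java
import java.util.Arrays;

// Toy model: four cities on a ring trade one resource each round.
// Every link moves 20% of the gap from the richer city to the poorer one,
// so a drought in one city ripples through the whole network.
// All names and numbers here are invented for illustration.
public class CityRipple {
    public static void main(String[] args) {
        double[] water = {100, 100, 100, 100};            // one resource per city
        int[][] links = {{0, 1}, {1, 2}, {2, 3}, {3, 0}}; // ring: A-B-C-D-A

        water[0] -= 60; // a drought hits city A

        for (int round = 1; round <= 5; round++) {
            double[] next = water.clone();
            for (int[] link : links) {
                int a = link[0], b = link[1];
                double flow = 0.2 * (water[a] - water[b]); // positive: a exports to b
                next[a] -= flow;
                next[b] += flow;
            }
            water = next;
            System.out.println("round " + round + ": " + Arrays.toString(water));
        }
    }
}
```

Even this tiny network shows how one local cause turns into an aggregate, system-wide effect; a SimCity World scale version is, in spirit, the same loop with thousands of cities, many resources and far richer rules.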

Mapping the entire Earth into such a virtual world is hardly more than a matter of time; it is a relatively static problem. The real issue is the dynamics, the computational resources it demands and the computational infrastructure behind it. Thankfully, a few of these challenges can be tackled using recent innovations in computing infrastructure, such as energy-efficient (ARM or Intel Atom based) multicore microservers, OpenStack cloud infrastructure, high-speed fibre internet and so on.

From here on, it is worth watching how this space develops, especially the competition (such as Second Life) and the creativity.

Yesterday I read about Intel’s upcoming Xeon Phi co-processor with 50+ x86-compatible cores. As per the graphics, the co-processor will provide a teraflop of performance and occupy just one PCIe slot. It is great to see that Intel and other vendors can provide such phenomenal computing power by adding cores to a single processor chip. We have always known that GPUs, with their large number of cores, offer phenomenal power in SIMD mode, and GPGPU has unleashed that power for work beyond graphics. The Xeon Phi, however, does not seem to be restricted to strict SIMD. After reading this news, I confess that I am tempted to look at the field holistically and simply blog my muddled thoughts in some order. So what is the theme? The theme is ‘computation is following a biologically evolutionary path’. First, I will try to articulate the past, the present and, vaguely, the future of microprocessors. Later, I will attempt to identify similarities with biological evolution.
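As a side note on why general-purpose, x86-compatible cores are interesting, here is a small hedged sketch (the workload is my own arbitrary example, not an Intel benchmark): the kind of loop-level parallelism a GPU needs a dedicated kernel for can be written as an ordinary parallel loop, and it simply spreads across however many cores the machine exposes.

```java
import java.util.stream.IntStream;

// A plain data-parallel loop: estimate pi by midpoint integration of 4/(1+x^2).
// On a many-core, general-purpose chip the parallel stream fans the work out
// across all available cores without any GPU-style kernel rewriting.
public class ManyCorePi {
    public static void main(String[] args) {
        int steps = 100_000_000;
        double width = 1.0 / steps;

        double pi = width * IntStream.range(0, steps)
                .parallel()                     // fork-join across all cores
                .mapToDouble(i -> {
                    double x = (i + 0.5) * width;
                    return 4.0 / (1.0 + x * x);
                })
                .sum();

        System.out.println("cores available: " + Runtime.getRuntime().availableProcessors());
        System.out.println("pi estimate:     " + pi);
    }
}
```

The point is simply that a co-processor made of general-purpose cores can run unmodified code like this, whereas a device locked to strict SIMD would demand a rewrite into its own kernel language.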

Evolution of Intel and AMD microprocessors (as representatives) till today –

  1. Intel 4004 – First single-chip microprocessor.
  2. Intel 8086 – First microprocessor with the x86 instruction set.
  3. Intel 80386 – First 32-bit x86 microprocessor, built for multitasking.
  4. Intel 80486 – Microprocessor with a built-in math co-processor. This marks the beginning of the heterogeneous microarchitecture era; however, it did not go much further for a decade.
  5. Intel Pentium – Superscalar implementation of the x86 architecture, with multiprocessing support.
  6. Intel Pentium D – Intel’s first multicore (dual-core) microprocessor.
  7. Intel Pentium M – Introduction of energy-efficiency features.
  8. Intel Core – Scalable and energy-efficient microarchitecture (to date, it supports up to eight cores).
  9. AMD APU – First mainstream microprocessor with built-in GPU cores. This is a rejuvenation of the heterogeneous microarchitecture idea, and it now has momentum.
  10. Intel Xeon Phi – Intel’s first many-integrated-core product built from x86-compatible cores. Moreover, it fits into the system as an add-on card running its own little Linux OS.

In the future, it may continue as –

  1. 1000+ core microprocessors. Each core will be a simple one (most likely an ARM variant).
  2. More hybrid processors will be launched. For example, a processor may have 8 cores split as 4 x86 cores, 1 ARM core, 1 GPU core and 2 ASIC cores.
  3. Reconfigurable microprocessors – Processors may gain an emulation mode. For example, a processor could be configured through the system BIOS to present all x86 cores in the morning and all ARM cores in the evening.
  4. Upgradable instruction sets. For example, I could upgrade from a Core i5 to a Core i7, and that too for only a few cores. It appears that an upgradable instruction set is required for reconfiguration, but not strictly. Reconfigurable microprocessors and upgradable-instruction-set microprocessors may follow one another in quick succession, and the order of their arrival depends on the level of flexibility achieved for each requirement.
  5. Computing dust. Processors would grow smaller and smaller, to the extent that a ‘not-so-advanced‘ processor could be the size of a grain or even a dust particle. I cite the Hitachi RFID powder chip as a beginning, although it is not a microprocessor. What is significant here is the organization of these resources and their interconnects. A liquid network medium is quite possible and may provide a substantial advantage over contemporary ones. (Let me call it a ‘Swimming Tank Interconnect’ 😀 )

Quite exciting!

I don’t want to attach any dates to these milestones, but given the trend, we can say with fair accuracy that amorphous computing should become a reality by 2016 and part of everyday life by 2022.

Each of these would demand a significant reorientation of the software development paradigm, especially the last milestone. In a separate post, I will articulate each of these challenges and the possible paradigm adaptations.

…to be concluded!

It’s been a long time since I got a chance to write something here. There are many things I feel like sharing, and I hope this post will be a good start.


In February 2007, our team informally discussed and decided that each of us would present something we liked, had learned and wanted to share with others. I chose ‘Evolution’ as my topic. I delivered a couple of presentations but could not stop reading on the subject.

Evolution is an interesting process. It is history with a running context. A context gives a perspective to an observer. One can have many simultaneous contexts and thus many simultaneous evolutions. However, when one applies a higher-level context to all the running contexts, one can see a higher-level evolution: the evolution of evolution (or, jargon-ishly, ‘meta-evolution‘).

Evolution is a passive, continuous process. It does not play a part itself; rather, it is the script under constant development. One cannot sense an evolutionarily important cause unless it leads to perceivable effects. An observer (who has a context and a perspective) picks meaningful patterns out of the insignificant routine.

Although it has been studied most prominently in biology, it is not uncommon to see patterns of evolution elsewhere. This series is such an attempt, to see the World through the lens of evolution. Feel free to comment; comments add different perspectives and help us co-evolve. 🙂

I was exploring the internet for news about Oracle’s offer for BEA Systems. During that exploration, I came across a blog post which discusses Oracle’s offer from many interesting angles, including speculation about a merger between SAP and IBM. Some time back I read about Microsoft’s interest in Yahoo, and years before that we saw the huge merger between HP and Compaq. All these inputs triggered a thought in my mind: a merger between Adobe and Apple. It might be a flight of fancy, but such a merger could have many practical benefits for both entities.

Adobe has a strong foothold in applications and products such as Adobe Photoshop, Adobe Acrobat, Adobe CS3 and now Adobe Flex. In the era of Rich Internet Applications, Adobe’s Flex should play a very important role. Historically, Flash has been a popular portable runtime, but only for browsers. The Adobe Integrated Runtime provides the same capability on the desktop, so that Flex applications can run as if they were native applications. Apart from these products, Adobe supports ColdFusion, a server-side web platform similar to ASP/JSP/PHP/Rails. Very recently there was news indicating Adobe’s interest in an online office suite. Having a good foothold in the products space, what Adobe does not have (or at least what I don’t know of) is exposure to hardware platforms, appliances and operating systems. This is where Apple’s expertise could deliver dramatic results.

Apple has been the choice of connoisseurs. The Mac, Mac OS X and so on have become popular, and their users are hard to convince to switch to another platform. Innovative products and services such as the iPod, iPhone and iTunes have helped develop a creative and positive image in people’s minds. Despite all these success stories, Apple has not been as successful in application software as it has been with its platforms and appliances.

Apple’s experience in hardware, platforms and services and Adobe’s experience in applications, platforms and development tools are complementary. Of course, Apple is larger in terms of revenue, employee strength, number of years in business and so on. As I see it, two choices are available: merger or collaboration. A merger of Adobe and Apple could produce an entity that is more innovative and more competitive, ready to play an important role in the years to come.

But after all, this is all day-dreaming…

Over the last 12 years, Java has become almost the de facto standard in the application development paradigm. In the initial days, people complained about the performance of Java programs. However, there is no doubt that the enormous effort put into optimizing the Java compiler and JVM implementations has given handsome returns. But we know, or rather we need to know, that there is an upper limit to this performance optimization as long as it is implemented in software. Despite Java’s wide acceptance, Java Virtual Machines are still limited to software deployments. There is an emerging need to have the Java Virtual Machine in hardware.
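As a rough illustration of how much of today’s Java performance comes from run-time optimization done in software, here is a crude sketch (not a rigorous benchmark; the class and method names are mine): timing the same method repeatedly typically shows the early iterations running slower than the later ones, once the JIT compiler has kicked in.

```java
// Crude illustration of JIT warm-up: the same work usually gets faster after
// HotSpot compiles the hot method. Numbers vary by machine and JVM settings.
public class WarmupDemo {
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        for (int round = 1; round <= 10; round++) {
            long start = System.nanoTime();
            long result = sumOfSquares(5_000_000);
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.println("round " + round + ": " + micros + " us (result " + result + ")");
        }
    }
}
```

Once that software ceiling is reached, further gains have to come from the platform underneath, which is exactly where a JVM in hardware would fit.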

Fortunately, the space is not entirely unexplored territory. There have been several efforts to implement Java processors, including picoJava from Sun Microsystems. It seems a very promising concept, and it should become more and more relevant in the days to come. Imagine a system with many cores, for example the Sun UltraSPARC T2, which has 8 cores per CPU. All these cores are identical, and a server with an 8-way configuration would have 64 cores. This kind of system leaves a lot of room for something called ‘domain-specific processors’, so it makes a lot of sense to have, say, four dedicated Java processors as part of the system. One such example is IBM’s System z Application Assist Processor (zAAP). The primary benefit of such processors would be their specialization: they can be optimized to a greater extent, upgraded frequently and made cheaper. Apart from that, they leave the main general-purpose processors free to do their own tasks. Thus a Java processor can be a co-processor to your main processor; remember known examples such as the Intel 387 or today’s Graphics Processing Units (GPUs). Check out some benchmarks for IBM’s zAAP.

Another very interesting initiative is from BEA Systems, which talks about a JVM hypervisor. This can provide some breathing space in the meantime. The idea was, I guess, first presented by Joakim Dahlstedt (CTO of BEA) at JavaOne 2006. One can find a PDF of the presentation here – “Bare Metal”—Speeding Up Java™ Technology in a Virtualized Environment.

Mixing ideas is almost always easy initially and painful later on. The same happened to me for a while, so I decided to start another blog, ‘http://clairvoyant.wordpress.com’.

So to make it simple –

I hope this arrangement will work out fine.