Sunday, September 4, 2011

If you are a technical person, or you currently work in IT, then you have probably seen this happen more than once: corporations making bad technology decisions, specifically when buying new software that is supposed to help the business grow and bring in even more revenue. Here is a fictional story to drive home my concern over this topic.
Company A has decided to purchase an expensive piece of software that will impact the entire company and take many years to implement, yet will not really make a difference, since Company A has already built in-house software that provides the same functions the new product would.
So Company A narrows the field down to three prospective vendors to purchase the software from.
Prospect #1 -> this vendor's product is built on .NET. The product is a bit expensive, but it is well known, feature rich, and has brand recognition that is valuable when prospective clients are evaluating whether they want to use Company A's products and services.
Prospect #2 -> this vendor's product is built on J2EE, and the vendor is located in the same city as Company A. The product is not as well known as Prospect #1's, but it matches the platform Company A already uses, and because the vendor is local, support would be very easy to get (they could physically come on-site to fix issues).
Prospect #3 -> this vendor's product is written in a very old, non-enterprise language: Python. Every customization costs extra, and the product requires a minimum of 20 production servers to meet Company A's needs today, not counting future growth.
So the senior/lead technology employees at Company A are asked to interview, evaluate, and research each of the three vendors and to produce a scorecard showing which vendor most closely meets Company A's needs.
The evaluation team posts the scorecard (best fit to worst fit): Prospect #2 first, then Prospect #1, with Prospect #3 dead last (Prospect #1's score was twice that of Prospect #3).
Guess which vendor the top-tier employees at Company A chose? Vendor #3. So Company A has now invested millions of dollars in an outdated technology platform. Why? Good question, and one that no technology person at Company A can really answer...
Sunday, March 28, 2010
Java: The Beginning
The Java programming language was first introduced by Sun Microsystems in 1995 (Java version 1.0), but the language actually pre-dates that release: work began in 1991, when the language was called "Oak".
The first apparent benefits of the language were "Write Once, Run Anywhere" (WORA) and Java Applets. Before Java's WORA concept, most languages were written and compiled for deployment on a single operating system.
Being able to write code once and have it run on a variety of operating systems, such as Windows, Linux, the different flavors of Unix, and even Apple platforms, was a very important capability for developers. What makes this multi-platform development and deployment possible is how Java interacts with the underlying operating system.
Java applications are compiled into Java byte-code, which in turn is executed inside a Java Virtual Machine (JVM). It is the JVM's responsibility to provide the bridge from Java byte-code to the underlying operating system. Each supported operating system (OS) / software platform has its own JVM that knows how to natively talk to the underlying OS. This architecture is the true power of the WORA promise of Java.
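To make the byte-code / JVM relationship concrete, here is a minimal sketch of WORA in action (the class name WoraDemo is just for illustration): the same compiled .class file runs unchanged on Windows, Linux, or a Mac, and only the JVM underneath changes.

    // WoraDemo.java -- compile once with: javac WoraDemo.java
    // The resulting WoraDemo.class (Java byte-code) can then be run with
    // "java WoraDemo" on any OS that has a JVM; no recompilation needed.
    public class WoraDemo {
        public static void main(String[] args) {
            // The JVM, not our code, handles the native talk with the OS;
            // here we simply ask it which platform it is bridging to.
            System.out.println("OS:  " + System.getProperty("os.name")
                    + " (" + System.getProperty("os.arch") + ")");
            System.out.println("JVM: " + System.getProperty("java.vm.name")
                    + ", Java " + System.getProperty("java.version"));
        }
    }

Run the same class file on two different machines and only the reported OS changes; the byte-code itself never has to be touched.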
The past 15 years have spawned a giant inventory of Java applications, especially with the introduction of Java Enterprise Edition (J2EE) in 1999.
This is a very brief history of the Java platform. For a more comprehensive history of Java, visit the Java Wikipedia site and the Java History Timeline.