Sunday, April 27, 2008

Wharton - Chapter 18

Dr. Jason McDonald told our marketing class something that I will never forget: "Today's employees are more concerned with lifetime employability than with lifetime employment." This is the central theme of the final chapter of the Wharton book. The chapter explains that a steady paycheck and a promise of future employment are not what today's minds are looking for in a job. People are looking for a job that will accommodate their lifestyle and allow them to have input and a voice in company decisions. Today's successful company must create an adaptable workplace that provides not only the hygiene factors of employment, such as salary, benefits, and an enjoyable environment, but also the motivators that give people true satisfaction in the position. Some people will value on-site daycare or flexible working hours, and these "outside the norm" factors may be imperative to recruiting the best minds for the job. Companies must also realize that employees are going to come and go, and that the lifetime commitment to an employer that their fathers had no longer exists in today's competitive workforce. Human capital is the most valuable asset a company can have, and it can be the hardest to replace. The successful companies of tomorrow will recognize this and find ways to extract the most benefit from it.

Wharton - Chapter 17

Chapter 17 is all about the six new forms of organization that are replacing the old hierarchical style. Although the lines between these types of organizations are not definite and they are not mutually exclusive, they can be broken into the following forms: Virtual, Network, Spin-out, Ambidextrous, Front-Back, and Sense-and-Respond.
Virtual Organizations are those that utilize technology to bring together geographically dispersed people, suppliers, employees, etc. Network Organizations are built of modular internal or external autonomous or semi-autonomous business units that are largely free to operate as they see fit to meet company goals, with little supervision. The Spin-out Organization is a business unit with its own identity that breaks away from the large parent to become its own organization, with the parent primarily acting as a financier or offering operating guidance. The Ambidextrous Organization allows an emerging technology to break out of the "innovator's dilemma" and be fostered and pursued without the company abandoning its core business in the process. A Front-Back Organization focuses first on satisfying the customer, and the organization works inward from there. Also focusing on customer needs, the Sense-and-Respond Organization is flexible enough to adjust and adapt to ever-changing customer needs.

Sunday, April 20, 2008

Wharton - Chapter 15

Chapter 15, when combined with Chapter 11, is a complete contradiction. Chapter 11 explains how to keep your knowledge a secret and secure it; then Chapter 15 talks of the extreme importance of sharing knowledge. The key to mending these ideas is in the title of Chapter 15: "Managing Dynamic Knowledge Networks". Despite using a poor aviation example as its initial base (its abundant acronyms can make the nodes and arcs confusing), the message behind this chapter is abundantly clear: creating a knowledge network and establishing degree centrality and betweenness centrality is usually a good indicator of a firm's success. Whether it comes from alliances, engineers in chat rooms and forums, or industry conferences, a solid interconnect of knowledge sharing can have a very significant impact on the company. In looking for further information about knowledge networks, I ran across many examples of a subject I know as "Knowledge Management", using Ernst & Young as an example, but I found the idea of knowledge sharing best described in an article by Peter Cukor and Lee McKnight of MIT, "Knowledge Networks, Development, and the Internet": "Knowledge Networks, in general..., are expected, by their purpose and nature, to shift organizational culture towards collaborative activities within and among institutions." (McKnight & Cukor, 2001, p. 6). The article can be found at: http://ebusiness.mit.edu/research/papers/120 McKnight, Knowledge Networks.pdf
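To make those two metrics concrete for myself, I put together a quick Python sketch (entirely my own, not from the chapter) that computes degree centrality and a naive betweenness score for a made-up alliance network; the firm names are invented placeholders:

```python
from collections import defaultdict, deque
from itertools import permutations

# Undirected alliance edges between hypothetical firms.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]

graph = defaultdict(set)
for u, v in edges:
    graph[u].add(v)
    graph[v].add(u)

def degree_centrality(g):
    """Fraction of the other nodes each node is directly tied to."""
    return {node: len(nbrs) / (len(g) - 1) for node, nbrs in g.items()}

def betweenness(g):
    """Naive betweenness: how often a node sits on shortest paths between
    other nodes. Fine for toy graphs; real work would use Brandes' algorithm."""
    score = {node: 0.0 for node in g}
    for s, t in permutations(g, 2):
        paths, queue, best = [], deque([[s]]), None
        while queue:  # BFS enumerating every shortest s-t path
            path = queue.popleft()
            if best is not None and len(path) > best:
                continue
            if path[-1] == t:
                best = len(path)
                paths.append(path)
                continue
            for nbr in g[path[-1]]:
                if nbr not in path:
                    queue.append(path + [nbr])
        for path in paths:
            for node in path[1:-1]:  # credit the intermediaries
                score[node] += 1 / len(paths)
    return {node: v / 2 for node, v in score.items()}  # each pair counted twice

print(degree_centrality(graph))  # firm "A" is the best-connected broker
print(betweenness(graph))
```

In this toy network, firm A scores highest on both measures, which is exactly the "broker" position the chapter argues correlates with success.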

Wharton - Chapter 11

This chapter explores the four main components a company should use, in complement to each other, in order to appropriate the gains on their technology. First, patent protection is defined. Patents are good for protecting the use of an invention (in a utility patent) for a period of twenty years from the date of application. What the authors do not mention is that the gap between the date a patent is applied for and the date it is granted can be up to three years (http://www.law.cornell.edu/uscode/35/154.html), and that would be more than enough time to be infringed upon in the IT world. Although the competitor's product would have to be off the market by the time the patent issued, they would still have plenty of time to copy the invention and invent around it to avoid infringing on any important claims in the original patent. But patents are good to have, especially for a pharmaceutical company, where inventing around is significantly difficult. The second protection discussed is secrets. Secrets are good in that if no knowledge is published (a patent is published 18 months after application) and if the company clearly identifies and acts accordingly to keep something a secret (http://library.findlaw.com/2000/May/1/130451.html), then this type of protection can be even better than a patent. Coke's formula for Coca-Cola is a humongous trade secret, and they obviously do not want to patent it, because then the formula would be published. Secrecy can be compromised by espionage, though, and keeping a secret becomes much harder as the process or invention grows more complex.
Complementary Assets are the third protection, and Starbucks was used as the example here. Anybody can sell coffee, so Starbucks knew there had to be something identifiably theirs if they were going to attract and retain customers. Starbucks sells an experience along with their coffee. It is the expertise of the baristas and the decor of the Starbucks stores that complement their coffee, and those kinds of things are extremely difficult to duplicate. Lastly, and given the highest priority by both the Yale and CMU studies, is Lead Time. Being first to market does not guarantee success, and moving too quickly could compromise quality or service support, but combining first-to-market with the protection of a patent, or being first with a product nobody has ever thought of before, can be a huge advantage in gaining customer loyalty, relationships, and visibility.

Saturday, April 12, 2008

Wharton - Chapter 10

Chapter 10 is the longest chapter yet, but what the authors set out to explain was certainly an out-of-the-box way of thinking. Typically, I think companies focus on contingency planning, a whole "if this happens, then we will do this", but this chapter shed light on a manner of planning that I had never heard of: Scenario Planning. Using the print industry and Knight-Ridder as examples, the authors walk through the components necessary to construct successful scenario planning.
As a purely imaginative thought process, an organization puts together a matrix, constructing cells out of the wealth of possibilities it may face. Certain attributes are reliant upon other attributes, such as "may sink in the water" only relating to "put on a sea vessel", and some combinations are just impossible, such as their example of full employment for an economic firm. The matrix is divided into cells, and these cells provide the basis of uncertainty for the firm to plan upon. This is all completed using a ten-step process, and if problems arise, the firm is free to start over.
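To picture how the matrix works, here is a rough Python sketch of my own (not the book's ten-step process): cross every uncertainty's possible outcomes into cells, then prune the internally inconsistent combinations. The dimensions and the consistency rule are invented for illustration:

```python
from itertools import product

# Hypothetical uncertainties a firm might face, each with possible outcomes.
uncertainties = {
    "economy":    ["recession", "boom"],
    "regulation": ["tightens", "loosens"],
    "technology": ["incremental", "disruptive"],
}

def impossible(cell):
    # Example consistency rule (invented): assume a boom never coincides
    # with tightening regulation in this toy model.
    return cell["economy"] == "boom" and cell["regulation"] == "tightens"

keys = list(uncertainties)
# Cross-product of all outcomes gives every raw cell of the matrix.
cells = [dict(zip(keys, combo)) for combo in product(*uncertainties.values())]
# Pruning the impossible cells leaves the scenarios worth planning against.
scenarios = [c for c in cells if not impossible(c)]

print(f"{len(cells)} raw cells, {len(scenarios)} plausible scenarios")
```

The surviving cells are the uncertainty space the firm would then flesh out into full narrative scenarios.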
For scenario planning to be successful, it is important to gain the support of top management, get diverse inputs when constructing the matrix, and stimulate new strategic options. Nobody can predict the future, but this mechanism can help an incumbent firm be prepared for it, and not "stick their head in a hole" (the ostrich example) and hope it will just go away. Hodgson (2004) suggests that without explorations of the future (as opposed to a direct plan with little scrutiny) to deal with possible uncertainties, strategic planning "creates a default scenario: 'a future that validates the plan and this view of the future dominates … decision making'." Traditional thinking about the future works well in a relatively stable environment, and it has proven successful in the past due to the incremental nature of industry changes. Today's innovation-centric businesses and the vast amounts of information available so quickly to so many people have changed the way businesses must develop their strategy, and scenario planning skews away from the traditional into a new arena to accommodate this fast-paced change.
If anyone is interested, there is a great article on Scenario Planning titled: "Scenario Planning: An Innovative Approach to Strategy Development" by M. Conway of the Swinburne University of Technology. You can find it here: http://www.aair.org.au/jir/2004Papers/CONWAY.pdf

Saturday, April 5, 2008

Wharton Chapter 9

Continuing where chapter eight left off on the development of strategy, chapter nine explains why a company needs to use "disciplined imagination" to develop their strategy. Developing a strategy is an art, not a science, and because of that "trends in strategy appear to alternate emphasis between discipline and imagination" from decade to decade. Both components have inherent strengths and weaknesses, and in some cases the strengths can become instant weaknesses if some extraneous circumstance happens to occur. A good example in this chapter was General Instrument. They were extremely good at disciplined strategy creation, but once the regulatory landscape of the cable business changed, they were at a substantial disadvantage.

Discipline is consistent and methodical, and although it does not require a formal planning process, it makes it easier to identify and correct issues relating to the execution or planning of the strategy. It does have its pitfalls, though, as it is based more on analysis than synthesis.

Imagination is more creative, and should be the first part used when combining discipline and imagination. This idea has more of a focus on synthesis. Create lots of options initially and use diversity to examine and define the problem, as this will include more of a vision of the future, and not a reproduction of the past. Also not perfect, the imagination direction can bring about chaos, dilute individual creativity, and slow down the strategy-creation process.

These reasons are why it is best to foster a balance of discipline and imagination when creating a strategy. The way to do this is to generate imaginative options and evaluate the options consistently. "Strategy making can be an art, but in situations of uncertainty, strategy making must be an art."

In Paul Schoemaker's article "Disciplined Imagination: From Scenarios to Strategic Options", he suggests using a core capabilities matrix to integrate three steps:
1. Scenarios will be used to examine the external environment, specifically those trends and key uncertainties that affect all players.
2. Industry analysis and strategic segmentation will help define the battlefield in terms of competitors, barriers, and profit potential.
3. Core capabilities analysis provides the basis for developing a strategic vision for the future.

Use these three steps to derive a vision, and then use that vision to create strategic options.

Friday, April 4, 2008

Wharton Chapter 8

This chapter was a very short one, but also very important. It focuses on commercialization strategies for companies that embark upon developing and marketing emerging technologies. Half the chapter uses Mergenthaler and the evolution of typesetting as examples, and the other half uses photography to support the authors' rationale. Essentially, the chapter explains how sometimes having a superior technology does not guarantee success in the marketplace. It also explains how sometimes an incumbent company can have inferior technology, yet still become the survivor in the market. This is accomplished by recognizing four changing components to change their overall strategy: Change in Customers, Change in Competitors, Change in Complementary Assets, and Change in Technology. These components establish that a successful company must realize the value in creating customer relationships and giving customers what they want. There was also a significant focus on the value of complementary assets. If a firm wants to get a foothold in a market, it may be able to do so with inferior technology if it offers other components alongside that technology to give it added value. Would iTunes be the number one online music retailer if the iPod did not exist? Or would the iPod be so popular if iTunes never existed? These two technologies face stiff competition from such powerhouses as Wal-Mart and Microsoft, yet still emerge successful because of the relationship they have with each other. The end of the chapter recognizes the entirely new market segments that have to be explored when introducing new technology, and what technology these new segments may replace. The increasing popularity of the digital camera reduced the need for film, and these digital camera users typically had different requirements than did traditional camera users.
Even Polaroid, founded in 1937, will finally retire its instant film technology in 2008 due to the market transition to digital photography.

Sources:
http://www.dispatch.com/live/content/local_news/stories/2008/02/16/POLAROID.ART_ART_02-16-08_A1_AH9CHQJ.html?sid=101
and
http://www.crunchgear.com/2008/04/03/itunes-now-number-one-music-retailer-in-us-npd-numbers/

Thursday, April 3, 2008

Presentation 2

The Future of Dynamic Circuit Networking
A pleasant residual of the dot-com boom is that many parts of the world were left with high-speed fiber optic cables that link many points together. This glass-based system of transport has capabilities of supporting transmissions as fast as 1,000 Gb/s and is 1,000 times larger than the total radio bandwidth of planet Earth. Unfortunately, fast optical networks are significantly bottlenecked at routers that must read and evaluate every packet of information sent through them. Although the throughput of routers has increased significantly over time, they are still a far cry from being able to process information at fiber optic speeds of a terabit per second. Internet2 is paving the way of the future to full utilization of fiber optic transmission lines through the creation of the Dynamic Circuit Network (DCN). Using IPv4 or IPv6 protocol, this revolutionary re-design of packet transmission will be the first step of a complete re-structuring of the world's data network as we know it, discussed throughout this paper.
In traditional IP networking, the packets of data are sent over whichever data lines are available (this packet takes this route, that packet takes another route, etc.). This mechanism necessitates that every packet generated and sent has to: a.) contain origination and destination information, b.) be evaluated by each router along the path, and c.) be deconstructed at the origin and then re-assembled at the destination once all packets have arrived. This has proven efficient in the past for redundancy, in that a single downed line would not prevent the transmission of data, but it does not make the best use of the resources available. Conversely, in DC networking the entire path from origination to destination is pre-determined before any information is sent. This virtual circuit is dynamically created for the transmission of the defined data. The circuit is established and reserved for this data, the data is transmitted directly from origin to destination using this path, and the transmission line is freed up after the completion of the transmission.
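To make the contrast concrete, here is a toy Python sketch of my own (not how Internet2 actually implements the DCN): the shortest-path computation that an IP router effectively repeats as traffic flows is instead performed once up front, and the resulting path is reserved as the circuit. The topology and link weights are invented:

```python
import heapq

# Weighted links of a small hypothetical network.
links = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 1, "D": 5},
    "C": {"A": 4, "B": 1, "D": 1},
    "D": {"B": 5, "C": 1},
}

def shortest_path(src, dst):
    """Dijkstra's algorithm: the kind of path decision routers make
    continuously in IP networking, but that a DCN computes once per circuit."""
    dist, prev = {src: 0}, {}
    heap = [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in links[node].items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Circuit model: compute the path once, reserve it, stream everything
# over it, then release it -- no per-packet routing decisions.
circuit = shortest_path("A", "D")
print("reserved circuit:", circuit)
```

In the packet model this computation (or a table lookup derived from it) happens at every hop for every packet; in the circuit model it happens once, and the routers along the path just pass the stream through.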
The technical aspects of a Dynamic Circuit Network can be found in U.S. Patent No. 6,707,820 as well as in the presentation originally given by the author on March 6th, 2008. Moving forward with the research on the DCN and the great capabilities it can facilitate, the author began to wonder if there may be a future for router-less networking. If packet transmission is no longer random and follows a predetermined path, routers would only need to be aware of their place in any dynamic circuit created and transmit that data accordingly. They would no longer have to evaluate each packet and determine the best path to forward it to its next destination. Router-less networking, in its simplest form, is simply a cross-over cable between two computers. What if each host in a given geographic area connected to a DC domain and established a virtual circuit to it? The DC domain would relay that virtual circuit to the receiving host, completing the circuit. If all of these entities could be recognized as being on the same segment, then the communication could be created completely without routers. Just as a crossover cable creates a connection between two hosts, the same kind of connection would be created dynamically between two hosts over high-speed fiber optic lines.
Moving even further ahead, and discovering very little research information, is what this author believes to be the next great emerging technology, even beyond the router-less network: packet-less internet transmission. Some of the greatest emerging technologies have gone back to the past to find their roots. Think of SkySails and how they simply modernized the use of wind to move marine vessels. Packet-switching networks provided the means to move data more quickly and reliably than standard analog phone circuits. Now that technology has caught up to the speed requirements of data, we will someday retire packet switching in order to fully utilize the new dynamic virtual circuits. There will be no need to packetize the data in order to put it on transmission lines, as the data will have the capability to be streamed from the application layer directly from host to host. This idea of nullifying the creation and transmission of packets was first introduced to the author in a presentation on the capabilities and future of the Windows Server 2008 Hypervisor for virtual machines. Currently, each operating system running within a virtual machine must access the network through a software-virtualized network interface. Even if the network transmission is not transmitted over any physical lines, it is still broken into packets by the originating operating system and reassembled by the receiving operating system. As the world works towards more server consolidation and the virtualization of servers, the network responsibilities of each guest server will be passed to the host hardware to be transmitted to other hosts. If the Microsoft engineers are right, and packet-less network traffic can be transferred via the hypervisor that the guests sit upon, why could it not be successfully achieved on a larger scale?
The overheads of network virtualization can be overcome using CDNA (Concurrent Direct Network Access), and using this technology to directly access virtual machine NIC drivers via the hypervisor is really just a small-scale version of what could be created with a large-scale, dynamically created circuit. The creator of this circuit would act as a type of hypervisor, from a network perspective only, by having location information for all guests that are under its domain. Machine A makes a request to the DC domain, acting as a large-scale hypervisor, requesting a connection to Machine B. The DC domain knows if Machine B is available to receive requests, and if it is, the DC domain creates the circuit between Machine A and Machine B and securely manages the transfer at a super high rate of speed. See Exhibit A.
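Here is a speculative Python sketch of my own, mirroring the Machine A / Machine B story above; the DCDomain class and its methods are purely hypothetical, not part of any real DCN implementation:

```python
# Speculative sketch: a DC domain acting as a network "hypervisor" that
# brokers circuits between the hosts it knows about. All names invented.

class DCDomain:
    def __init__(self):
        self.hosts = {}      # host name -> currently available to receive?
        self.circuits = []   # reserved (source, destination) pairs

    def register(self, host, available=True):
        """A host joins the DC domain, like a guest registering with a hypervisor."""
        self.hosts[host] = available

    def request_circuit(self, src, dst):
        """Machine A asks the DC domain for a circuit to Machine B."""
        if self.hosts.get(dst):
            self.circuits.append((src, dst))
            return True   # circuit reserved; the transfer can stream directly
        return False      # destination unknown or not receiving

dc = DCDomain()
dc.register("Machine A")
dc.register("Machine B")
print(dc.request_circuit("Machine A", "Machine B"))  # True: circuit established
```

The point of the sketch is only the division of labor: the domain holds the location and availability information, and the hosts never make routing decisions themselves.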

Packet switching and T3 network lines are going to become obsolete in the upcoming years. Internet2 and the Dynamic Circuit Network are going to break out of academia and research and become mainstream, much like ARPANET did. The advantages of this super high-speed network will be plentiful for those brave enough and ingenious enough to take full advantage of it.

Saturday, March 22, 2008

Wharton Chapter 7

Chapter 7 focuses on the marriage of technology and market segmentation. Technology markets are defined as "lumpy", in that they can be fickle in determining what certain market segments want today and what they will want in the future. Using a simple example involving laptop computers and two-dimensional variables, the authors create an envelope defining the technology associated with the two variables. They then create a market segment diagram using three market segments' utility for the two variables. It was fascinating to see the result when they overlapped the two envelopes to create an estimate of where they were, and where they may want to be. This could be a determinant of what technology they may want to improve upon to reach a more diverse market, i.e. multiple market segments. They can also use this information to determine if investments in certain technologies will help them advance their market share.
Everything in this chapter focused on the idea of attributes. Attributes are the differentiating factor between the good of one company and a similar good of another company. These attributes require heavy analysis, which can be broken into three components: Basic (expected features), Discriminators (distinguish between providers), and Energizing features (discriminators that draw a sharp distinction). The authors then explore options for infiltrating these lumpy markets through clever use of a market niche, either by combining the niches of two different markets (Fusion), or by advancing their technology barrier to a place where they can offer a superior product to a competitor in a single niche (Single Niche Domination). Or they can take the most difficult road and create a new technology envelope by essentially revolutionizing the entire attribute set for a technology. An easy way to think about the value of attributes is to think of the history of the automobile. I think of the new Smart Fortwo car as an example of technology and market working together to establish a niche. If Smart had put out the car before a certain market segment was ready for it, it most probably would have failed. On that same premise, had there been a market segment for a car that got 33/41 2008 EPA miles per gallon and was only 8.8 feet long, but Smart did not have the technology investments in the right areas, they may have failed to make it to market and attract this niche customer at an attractive price.

Saturday, March 15, 2008

Wharton - Chapter 6

Traditional market research will not cut it in emerging technology. That is the general idea of chapter six in the Wharton book on emerging technologies. This chapter explores the uncertainty and the unknowns that come with putting a technology out to a market that does not yet exist. It is not prudent to use the standard focus group or an assumed market to test your product, as these may lead to incorrect or incomplete information, as well as possibly uncommitted or uninformed testers. The worst thing a company can do is to use a close-minded approach to market research that attempts to direct their product to a specific market or to the market they think they are addressing. They are best served by embracing a wide range of possibilities, using a "triangulation of insights" across multiple methods such as working with lead users, learning about latent needs, and anticipating inflections. Adoption of a product takes the shape of a bell curve, and at the forefront of that adoption are the lead users. They are not picky or high-maintenance, just adoptive of the technology as-is, and they can provide valuable insight into what processes the technology can be valuable for, or the needs and requirements the technology may need to develop further. Latent needs are needs that the customer does not even know they have, such as the example where Kimberly-Clark noticed that parents do not want diapers to be waste collectors, but rather "clothing" for their children. And the anticipation of inflection is the high point of the aforementioned bell curve, which assesses the areas of opportunity within a technology where people can evaluate what they may want in a more "futuristic" environment that this technology may address.
As Clayton Christensen states in his book "The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail" (page 50), once a technology is past its point of inflection and its performance is improving at a decreasing rate (I would call this diminishing returns to technology; that's my new term), then a new technology is ready to emerge to supplant the existing one. Anticipating that point of inflection on the technology S-curve is where a company can gain a comparative advantage over its competition.
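As a quick numeric illustration (my own, not Christensen's), a logistic S-curve makes the inflection point easy to see: the rate of improvement peaks there and diminishes afterward, which is exactly the signal a firm would want to anticipate:

```python
import math

def logistic(t, L=100.0, k=1.0, t0=5.0):
    """Toy technology performance over time: slow start, rapid climb, plateau."""
    return L / (1 + math.exp(-k * (t - t0)))

# Approximate the improvement rate numerically and find where it peaks.
ts = [i * 0.01 for i in range(1001)]  # t in [0, 10]
rates = [(logistic(t + 0.01) - logistic(t)) / 0.01 for t in ts]
inflection = ts[rates.index(max(rates))]

print(f"improvement rate peaks near t = {inflection:.2f}")  # near t0 = 5
```

Past that peak, each unit of effort buys less improvement, which is the "diminishing returns to technology" I described above.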

Saturday, March 8, 2008

Wharton Chapter 5

This chapter focuses on the role of government and its relationship to emerging technologies. The whole chapter uses the history of the Internet and the communications industry as the basis for ten important lessons. Sometimes the government plays a very satisfactory role in emerging technologies, as described by its development of ARPANET and the privatization of the Internet backbone network from the National Science Foundation. But sometimes the government can take on a non-satisfactory role for a company through additional regulations or the provisioning of a universal service obligation. The idea of a monopoly, and how monopolies work, was brought up many times in the chapter, including the government's role in regulating natural monopolies, where a market has room for only one competitor, and in preventing a company from gaining a monopoly by stifling competition, as Microsoft has been accused of doing. Treating customers poorly or overcharging can also bring about governmental regulation. Whether a company expects a supportive, adversarial, or no governmental effect at all, it should be prepared to deal with and adapt to all scenarios with its emerging technologies. With this chapter's focus on the Internet and its global outreach, very good examples were made of the challenges a company will face with overlapping jurisdictions of federal, state, and local laws, as well as global regulation. The ideas of conduit and content and of vertical integration are very important: to resolve a "closed" architecture that does not allow other companies to compete, the government may become involved in removing this bottleneck to competition, and thus stifle the company's ability to vertically integrate. It was interesting to read that most of the lessons (two, three, five, nine, and ten) either directly or indirectly point to lobbying for positions favorable to the company's perspective.
Lobbying can be a very effective and powerful negotiating tool that can lead to substantial success for a company. Would Enron have become the powerhouse it was had it not contributed $572,350 to the campaign of George W. Bush and gotten Kenneth Lay an evening in the White House with Mr. Bush? Even before Mr. Bush came to power, throughout the nineties "…Enron needed help in Washington, and it got it in a series of actions by Congress and the Federal Energy Regulatory Commission (FERC) that undermined the traditional monopoly of utility companies over power plants and transmission lines." ("Campaign Gifts, Lobbying Built Enron's Power in Washington", Washington Post, December 25, 2001). Sometimes lobbying can give a company the influence it needs to succeed…and beyond.

Saturday, March 1, 2008

Wharton Chapter 4

Chapter four centers its message around the story of AquaPharm Technologies Corporation and the different components involved in assessing emerging technologies. Assessing emerging technologies involves four components: Scoping, Searching, Evaluating, and Committing. In scoping, the company aligns its strategic intent and its capabilities with the technology it wishes to explore. There are many considerations in the scoping stage, including the target market, financing, R&D options, and organizational culture. Searching includes exploring the different avenues of how and what technologies a company is going to pursue. The company will explore the obvious literature and public information, as well as even private and borderline unethical avenues, to create a pool of candidate technologies. From the pool of candidate technologies, a select few are chosen for what I perceive to be the most important assessment step: Evaluation. Time or rigor spared at this step could lead to expensive and unnecessary failures of the technology if it goes to market. At this step the company assesses the various risks involved with the technology. The newer the technology, the riskier it becomes. Once a technology is ultimately decided on, the final step, Commitment, takes place. This is the step where the company chooses the path it will use to introduce the product to market, using a variety of strategic intents including: Wait and Watch (not actively developing the product, but actively keeping an eye on the technology and/or market), Position and Learn (keeping an option open to develop the technology and further pursuing learning about it), Sense and Follow (active commercialization, but not being first-to-market), and Believe and Lead (full commitment and no fear of being first-to-market). Throughout the chapter, the different stages are illustrated by the story of AquaPharm and its decisions at each step of the assessment process.
In its efforts to consistently secure investment capital, AquaPharm took on more than it could handle, and it learned some expensive lessons from its failure to complete a thorough evaluation of its technologies. The company ultimately failed, and the end of the chapter revealed that had it properly assessed its technology and its market (its core market, had it focused on it, ultimately became successful), the company would most likely be alive and successfully operating today.

Monday, February 25, 2008

GAPE in the Enterprise....Does it have a future?

"Google Apps in the Enterprise" is a compelling article that dissects both the pros and cons of Google's attempts to to promote their Google Apps Premier Edition to large enterprises. It talks about the various components, including Gmail, Google Docs and Spreadsheets, Google Calendar, Page Creator, Google Talk, and the control panel to administer these in an enterprise environment, and compares the functionality and feasibility of these against against offering found in Microsoft Office or Sharepoint 2007. The author explores the maturity of "Software as a Service" (SaaS) and the pitfalls and potentials of Google entering the Enterprise Content Management industry. The compare the infancy of SaaS to the infancy of the LAN to network PC's together and the early days of electricity (giving credit for AC to George Westinghouse instead of Nikola Tesla...I was offended), and thus recognizing the immense potential for SaaS in the future. They brought up some good points in that Google likes to release software that is "not quite ready" and that a company that adopted GAPE is loses a substantial amount of control of the versions of their applications and may sacrifice some privacy. But a couple other thoughts came to mind in reading this article. Companies that utilize a NAS and use their domain authentication to access files on that NAS would not be able to pass-through domain credentials to access files stored on Google servers. For the common user, this may create some complaints. Also, no matter how simple GAPE would be to use, there would still be a learning curve as people adapted to the interface when we as a society have become so accustomed to Microsoft Office. But, even through these pitfalls and the descriptions of different markets addressed by Google or Oracle or Salesforce.com, this brought memories of the idea of Application Service Providers. providing Applications from a central location that same way ISP's offer Internet connectivity. 
Although the term "ASP" was not mentioned in the article, this idea has been around for a few years and seems to be the way of the future. The same way IBM was okay with letting Microsoft license the OS because "everyone knows the money is in the hardware," maybe now the money is in offering only the software a company needs, when they need it, and only in the amount they need.

Monday, February 18, 2008

Energize Your Teaching..........with MySpace!!!

In reading the blog on "Energize Your Teaching," a very clear example was posted that seems to be the future of social networking: Facebook and MySpace. I would like to discuss MySpace, as I have more familiarity with it and it is the largest social network in the world. Although MySpace started out as a fun way to connect with friends and even make new ones, it has grown so much larger than that. It has created a whole new world of web developers, even out of people who had never seen one line of HTML code and thought JavaScript was the coffee stain on their shirts. It has created a relationship not only among people, but between people and their computers. They control how the computer portrays them by creating fancy profiles complete with images, music, and videos. They have begun to understand how a video from YouTube can be embedded into their page on MySpace. It becomes a computer version of a person's persona. You can create groups and post messages and blogs with such ease that there is no reason any of your "friends" cannot know exactly how you are feeling or what you are thinking at any given moment. I can even tell you personally that if I am wondering what kind of mood my girlfriend is in, I check her MySpace page and it tells me more accurately than if I asked her!

I believe AOL started this craze with their revolutionary "buddy list," opening up chat with the friends you could see were online. To their detriment, they never realized the immeasurable potential of the technology they implemented, and let it lie dormant. They fell into Trap Two of the pitfalls of emerging technology discussed by Day and Schoemaker. MySpace is so expanded now that it can go to your mobile phone and alert you with updates of information. If you search for a person on MySpace and they are not there, it feels like they are not even living in the Information Age. Social networking is only going to grow and encroach further on our personal lives. There lies an immense danger in this, as we have seen with the child predators on MySpace and the defamation that can be posted about an individual, such as that teacher, in Florida I believe, who was attacked and libeled by some upset students. There is a lot of liability in social networking, but the potential in how these threats are handled, plus the fact that people seem to not enjoy meeting face-to-face anymore, will create a market and a future that is almost unfathomable.


Personally, I do not like social networking sites such as MySpace, but to take that idea and contain the technology for more individualized benefit would make me a believer. I would love to make "friends" and communicate with people I have a common interest with, and that thought, in line with the "Energize Your Teaching" topic, makes me think it would be cool if MBA students had a POPULAR site they could frequent to make friends and discuss topics important to them....without the risk of getting "friend requests" every day from solicitors who aren't even real people (you MySpace people know exactly what I am talking about).

Wharton Ch. 3 - Enlightenment

This chapter really threw me for a loop, and brought me to an enlightenment about emerging technologies. What we sometimes view as a revolutionary technology is not revolutionary at all, but rather evolutionary. Focusing heavily on the examples of xerography, wireless radio transmission, and the Internet, this chapter sought to explain that a manager of an existing firm need not reinvent the wheel, but rather recognize other application domains that wheel could serve. So often a technology is developed for a sole, focused purpose, and it is not until that technology is taken out of its zone of development that its true potential is realized. For example, the Internet was developed decades before it changed the world. ARPANET was developed and used by the military and was only heard of in military and academic circles. But once Netscape popularized the Web browser, this information transmission mechanism called the "Internet" became a technology of value to the masses. The first two chapters had such a focus on what not to do in dealing with emerging technologies; this chapter was the first to start explaining what a firm should do. In reading about the Internet and its roots, I thought of what I believe will be the next big high-speed revolution: Internet2 and the Dynamic Circuit Network. The IP protocol has allowed for vast data transfer all across the world at relatively high speeds, but the way the packets traverse the Internet so randomly has created a bottleneck to achieving super high speed and ultra-reliability. Internet2 does not revolutionize the transfer of IP traffic at all. Rather, it has evolved that technology to provide a constant, contiguous high-speed "circuit" for the data packets to travel through. Much like the phone lines create a circuit-to-circuit connection, IP will operate in the same way.
As America and the world become more bandwidth-hungry and such an immense load is put on our routers to direct every packet of IP information, Internet2 will feed this transmission thirst to, in my opinion, move High Definition movies across the Internet in real time and let a doctor view real-time motion MRI scans from across the world. The world runs on the Internet, and Internet2 is going to be the new freeway. To borrow BASF's old slogan, Internet2 didn't invent the Internet, it just makes it better.

Wednesday, February 13, 2008

Wharton - Chapter 2: 1 Chapter and 4 Pitfalls

This chapter discusses four "traps" that an established company can fall into when investing in emerging technology. Simply put, they can "wait and see" and let another establishment do the due diligence of innovation, then only enter the market if it seems to have a future. This could be tragic because they will lose the early adopters of that product, and may never be able to catch the innovative firm in market share or technology to gain a strong foothold. This would be a bad mistake. Another trap a company can fall into would be to stick to their familiar business model and product. This should not happen to a successful company, because they would know that if they do not innovate, they will become obsolete. I liked the book's example of Encyclopaedia Britannica. But also, if companies only stick to what they know, the natural evolution of an industry could take them down. I think of Union Pacific for this example: a once-great railroad and transportation company that did not recognize the innovation that was coming along with the creation of the automobile. With UP's resources and knowledge of transportation, they could be bigger than Mercedes-Benz had they gotten into the market when Benz did. Third, if a company does not fully commit to an emerging technology, it may be the first thing scratched when their balance sheet starts to look a bit unfavorable. I think this can be attributed in large part to the way American investors value a company. Investment and stock are valued based on quarterly results, so if a company were to invest in a technology that would pay off immensely in four years at an increased expense now, investors would bolt. American investors seek immediate gratification and results, so this leaves the company no choice but to make only decisions that keep them in the black on their balance sheets. Having survived the first three traps, a company must not fall into the fourth one: failing to be persistent.
They have to realize that sometimes products or technologies hit the market before the market knows it needs them. The company must be ready to not pull a profit for many years in order to realize the full potential and success of their product. The book mentions Knight Ridder as an example of what not to do, and USA Today as a success story. I think of Amazon.com when I read this. They had substantial losses at first, but with persistence they are the online bookstore leader, even over such powerhouses as Barnes & Noble and Borders.
The way an established company can avoid these pitfalls lies in the culture it creates within its organization. Emphasis on and support for "collective learning" and critical thinking in collaboration will allow the company to "think outside the box" and always challenge the things it thinks it knows. From this chapter, I also have a better understanding of why large, successful businesses that own other businesses operate them as completely separate units. When PepsiCo owned Pizza Hut, for example, operating Pizza Hut as its own autonomous unit created the flexibility and culture that Pizza Hut needed to survive in its market. PepsiCo was not made great selling pizzas, so PepsiCo let that business unit, which knew what it did best, operate in its niche. I think PepsiCo and its subsidiaries are a great example of the right things for an established company to do when approaching emerging technologies.

The Hype Cycle....By Cody

The first thing I noticed was that Tera-Architecture is slated to be 10 years or more from mainstream adoption. I disagree with this estimate, given that they say the first step will be virtualization. Virtualization is not new technology by any means, but it is just starting to gain footing in the mainstream, with EMC's VMware leading the pack. Microsoft is also pushing virtualization heavily with their Server 2008 platform, and I believe once this sets in, Tera-Architecture could hit its 1% market infiltration maybe five years after that. So instead of more than ten years, I would say that seven or eight years would be a better estimate, without conducting further research.
Behavioral economics will flourish as we become less and less protective of privacy, so I do agree with the authors' conclusion. I also agree with their estimate of Idea Management being five to ten years out. I think that because I see how the ITSM model has infiltrated the mainstream corporate process and the vast success it has found. Idea Management will be just a natural progression of process integration.
The authors give a five-to-ten-year time frame on RFID, and that is something I also disagree with. I believe it will be further out, regardless of Wal-Mart's influence in pushing the technology. The tags are cost-prohibitive, and the radio frequency still has issues with bouncing off of metal and being absorbed by liquid. I recently visited NCR headquarters in Duluth, Georgia and witnessed their development of RFID tags. They were very novel, and with certain items, I really could just push a cart full of groceries through their receiving gate and have the total pop out on a receipt. But there were a lot of errors related to the aforementioned packaging, and I do not believe the technology will improve enough, and bring the cost down enough, to gain widespread adoption in the next five to ten years. RSS bar codes are cheap, can encode trillions of distinct item numbers, and are accommodated by most of the scanners on the market today. RSS will win over RFID for the next 20 years.
There were many other interesting notes on the Hype Cycle chart, and you can almost already see the difference between when it was posted in July 2007 and now. Web 2.0 is here and we are already approaching Web 3.0, and on location awareness I would say they hit it right on the button, so to speak. It is a very interesting chart in itself, but more than that is the way it advises one to THINK about how they see emerging technologies and lays a framework for how to approach their market strategy.

Saturday, February 9, 2008

Yahoo Pipes - Check it out

Here is my Yahoo Pipes page, with links to everyone's blogs. Pipes is really sweet. You can basically customize your page to contain only the information you want to see from other sites. It is obviously still in beta, as mine crashed three times. I suspect it crashed because I had Google pages open under a Google account and the Pipes page open under my newly created Yahoo account. Crash and burn! But it could also be attributed to IE 7, which still has inherent issues with a lot of functions since they are still trying to learn how to code their tabbed browsing.
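Under the hood, what Pipes does visually is mostly fetch-filter-merge on feeds. Here is a minimal Python sketch of that idea (the toy feed and the "rfid" keyword below are invented for illustration, not taken from my actual pipe):

```python
import xml.etree.ElementTree as ET

def filter_feed(rss_xml, keyword):
    """Keep only the <item> titles that mention the keyword (case-insensitive),
    roughly what a Pipes 'Filter' module does to a fetched feed."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if keyword.lower() in (item.findtext("title") or "").lower()]

# A toy feed standing in for a real RSS source
feed = """<rss><channel>
<item><title>Emerging Tech: RFID update</title></item>
<item><title>Campus parking news</title></item>
<item><title>Why RFID lags barcodes</title></item>
</channel></rss>"""

print(filter_feed(feed, "rfid"))  # keeps only the two RFID stories
```

A real pipe would fetch the XML over HTTP and chain several of these filter and merge steps together, but the principle is the same.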
But here is the link to my pipes. Check it out if you wish. http://pipes.yahoo.com/pipes/pipe.info?_id=lus6TWjX3BGEHDeDjtzu1g

Have a good day!

Energizing Your Teaching....My picks

From "Multiple Ways to Communicate online": I choose Social Networking
From "Use available tech tools to promote collaboration": I choose "Online Groups"
From "Provide timely feedback to students": I choose Blackboard gradebook
From "Reduce student time on task by showing students how to be more efficient": I choose "access files from any computer"
From "Make students' work relevant and authentic through opportunities to publicize student work online": I choose Wikis

But the schedule says for everyone to pick a different one, and there are not seven options per topic. So are we to choose only one subject overall? If that is the case, my favorite pick is "Social Networking".....

Thursday, February 7, 2008

Economic Forum - thoughts

In my opinion, the most innovative products awarded at the World Economic Forum were from InSightec and SkySails. InSightec's ExAblate 2000 takes the already incredible technology within the MRI to a new level. At the exact point when issues are visible within the human body, they can be dealt with under closer supervision than a surgeon's eye. It was just fascinating that they are able to combine diagnostics and treatment into one single visit. The addressing of uterine fibroids, for example, is no longer a sequential treatment process, where pictures are taken and then action is taken after the fact. With ExAblate 2000, these two separate processes are combined into one concurrent treatment. The other product that most impressed me was SkySails' wind propulsion system. What a simple concept, yet so advanced and ingenious. A simple sail could potentially set the standard for cross-ocean travel as oil becomes more expensive and carbon emissions become more regulated. Nearly four centuries ago, the Mayflower came to America using wind energy, and this new-age approach proves that some forms of power truly are ageless. The one company that impressed me the least from the Forum was Kayak. As a faithful Priceline user, I checked out Kayak.com to see if it had any benefits that would make me leave Priceline. The look and feel of it was no different from Expedia, Travelocity, or Priceline, so I decided to test it. On Priceline I could not book a trip to Hawaii as far out as December 2008. It was simply not available. So I went to Kayak.com, and it allowed me to attempt to book the same trip to Hawaii in that December 2008 time frame that Priceline would not. When Kayak.com was attempting to find my trip for me, it got locked in a processing loop that it did not error out of, return a message from, or simply exit. It just sat there processing. Even though it was a simple test, it left me with no faith in Kayak.com, and thus I will continue to use Priceline.
Not very innovative, in my opinion.
A sweet emerging technology not discussed at the forum was hybrid cars. I think they are making more of a statement and impact on the overall global economy than many of the ideas that were awarded. Regardless of whether they are feasible replacements for our dependence on oil in the future, they do a substantial job of advertising how great our reliance on oil has become. They have become an avenue of new thinking and of developing renewable energy sources in an arena that touches every aspect of human nature: transportation. Personally, I will not drive a hybrid right now because I think they are ugly, but I think the technology is pointing society in the right direction, and I think this emerging technology will change the transportation world significantly going into the future.

Wharton book on Emerging Technologies - Beginning

The first thing I noticed about the preface and Chapter One is that it picks up an important topic I remember covering in my Economics course: mature companies can hinder their own development of emerging technologies. The authors really elaborated on this subject in discussing the various circumstances that can lead a well-funded, well-managed development group to fail if it is too controlled or follows the tried-and-true thinking of its parent company. I liked how they emphasized that a balance needs to be found between the parent company and the child company to successfully develop and market a disruptive technology. The parent-teenager analogy made the concept easy to understand. The child company needs room to explore and grow on its own basis, but it also needs the right amount of guidance and expertise from the parent.
Emerging technologies need to have goals and milestones to monitor their progress and to keep investors interested, but they must also be flexible to adapt to where their product or market may head.
I remember distinctly one passage that really bothered me, though. The authors mentioned the conflict between Edison and Westinghouse concerning AC and DC electrical current. As I have always understood this important aspect of the early days of electricity transmission, Nikola Tesla was an employee of Edison's and was the first to push the idea of alternating current. Tesla sold this idea to his good friend Westinghouse, and from there the conflicts with Edison continued. On that note, Tesla sold the AC idea to Westinghouse for royalties of $2.50 per horsepower, which he was subsequently cheated out of, even though this technology brought light (literally) to the 1893 World's Fair in Chicago. Tesla was probably the greatest mind in history for the ideological development of emerging technologies. But more on that later; hopefully it is something I can discuss in the class session I get to present.
Although this book states clearly it does not hold the answers for developing successful emerging technologies, as most will fail, it does lay the historical framework for what has worked in the past and what has not. I think it will be a great read for this class.

Monday, February 4, 2008

Human Computation

Very fascinating. Beginning with the basic statistics on the volume of hours people spend playing solitaire, as it relates to the Panama Canal or the Empire State Building, was a great introduction for Luis to use to begin his presentation. It is interesting that we typically think of utilizing computing power in the technological sense, without realizing that the human brain has capabilities far beyond those of computers. If people can find basic enjoyment in playing a game, that processing and evaluative power can be harnessed alongside the processing power of computers and servers on the Internet. The games Luis demonstrated and explained seemed to hold some of the basic fundamentals that have made solitaire so successful: simple, achievable reward, relaxing, and yet still with an element of excitement and connection with other people. Indexing the photos on the Web would seem an impossible, daunting task, but putting that task into a game and inviting the time and cognitive thinking of thousands (or even millions) of humans makes the possibility certainly tangible. Luis addressed the sight-impaired benefits of his games, but this technology has advanced far beyond that. It is using the human element to address problems that computers' algorithms are not yet equipped to handle. It furthers the integration of the human brain and the computer processor into what really could become a Matrix-like world (as he joked about in his presentation).
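The core trick behind these labeling games, as I understood the presentation, is that a label only "counts" when two players who cannot communicate independently type it for the same image. A rough Python sketch of that agreement rule (the player guesses below are made up for illustration):

```python
def agreed_labels(player_a, player_b):
    """Keep only the labels both players independently typed for the
    same image (case-insensitive) -- agreement stands in for correctness."""
    return sorted({w.lower() for w in player_a} & {w.lower() for w in player_b})

# Hypothetical guesses from two players shown the same photo
a = ["Dog", "grass", "ball", "park"]
b = ["dog", "Ball", "frisbee"]
print(agreed_labels(a, b))  # ['ball', 'dog']
```

The real games add scoring, timers, and "taboo words" to push players toward less obvious labels, but this intersection is the heart of how human play gets turned into usable image tags.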

The Wall Street Journal Articles - My Take

***Business Solutions; Don't Fence Me In: New security technology doesn't put a firewall around a corporate computer system; Instead, it scans traffic, piece by piece***

A simple article on IDS (Intrusion Detection Systems), which anyone who works within the IT arena of a medium-to-large business would be familiar with. An IDS goes beyond the basic firewall (still a necessity) to monitor the contents of every packet that crosses into a company's yellow zone. These do require constant monitoring, and can report false positives (legitimate traffic blocked) unless they are tuned to the specifications of the company. They have matured significantly over the last several years, and will only get better.
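The "piece by piece" scanning the article describes boils down to matching each payload against a list of known bad patterns. A toy Python sketch of that idea (the signatures here are invented examples; real IDS engines like Snort use far richer rule languages plus stateful and protocol-aware matching):

```python
# Invented example signatures -- a real ruleset would have thousands,
# tuned to the company's traffic to cut down on false positives.
SIGNATURES = [b"DROP TABLE", b"/etc/passwd", b"cmd.exe"]

def inspect(payload: bytes):
    """Scan one packet payload and return any signatures found ([] = pass)."""
    return [sig for sig in SIGNATURES if sig in payload]

print(inspect(b"GET /index.html HTTP/1.1"))           # clean, passes
print(inspect(b"GET /a.php?q=1;DROP TABLE users--"))  # flagged
```

The tuning the article mentions is essentially deciding which patterns belong in that list for your environment, since an overly broad signature is exactly what blocks legitimate traffic.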



***Technology (A Special Report); Thinking About Tomorrow***

The last sentence of this article pretty much sums up where all of this technology is heading. As mobile phones have evolved to become all-in-one phones, cameras, GPS units, and media centers, the individuality of people will continue to disintegrate. More and more personal information is gathered on people, and sometimes they even offer up their most intimate information voluntarily on sites such as Facebook.com and MySpace.com. With satellite technology, it is possible to pinpoint the location of any person at any given time, and this gives me much angst over my privacy going into the future. People are more connected than ever before, and those communication channels have become so cheap to facilitate, and the data so cheap to store and analyze, that individuality and privacy will become things of the past. The technology is very cool, and a lot of it is focused on making our lives easier, but how big might the trade-off be for this convenience?



***Technology (A Special Report); Predictions of the Past: How did we do the last time we looked ahead 10 years? Well, you win some, you lose some***

In 1998 the WSJ polled for the technological changes the world would see in ten years (2008). Some predictions were spot-on (desktop computing power), while some were not so close (the Dutch). I think at the root of this article, though not stated, was the push vs. pull of technology. Some technology was developed and then found a market to serve, essentially telling people they wanted something before they wanted it. But most technology comes from a "pull". As the general population became more comfortable and accustomed to developing technology, they started finding new ways they wanted to use technology, and the developers answered that call. There has been a big idea of "listen to what the people want and give it to them, and then some". It is almost amazing how accurately some people predicted ten years out, and it leaves us to wonder what the world will be like come 2018.