Saturday, November 27, 2010

Jimi Hendrix

Jimi Hendrix at the Woodstock Music and Art Fair, 1969. [Credit: Henry Diltz/Corbis]

Jimi Hendrix, byname of James Marshall Hendrix, originally John Allen Hendrix (b. Nov. 27, 1942, Seattle, Wash., U.S.—d. Sept. 18, 1970, London, Eng.), American rock guitarist, singer, and composer who fused American traditions of blues, jazz, rock, and soul with techniques of British avant-garde rock to redefine the electric guitar in his own image.

Jimi Hendrix. [Credit: Lipnitzki—Roger Viollet/Getty Images]

Though his active career as a featured artist lasted a mere four years, Hendrix altered the course of popular music and became one of the most successful and influential musicians of his era. An instrumentalist who radically redefined the expressive potential and sonic palette of the electric guitar, he was the composer of a classic repertoire of songs ranging from ferocious rockers to delicate, complex ballads. He also was the most charismatic in-concert performer of his generation. Moreover, he was a visionary who collapsed the genre boundaries of rock, soul, blues, and jazz and an iconic figure whose appeal linked the concerns of white hippies and black revolutionaries by clothing black anger in the colourful costumes of London’s Carnaby Street.

A former paratrooper whose honourable medical discharge exempted him from service in the Vietnam War, Hendrix spent the early 1960s working as a freelance accompanist for a variety of musicians, both famous and obscure. His unorthodox style and penchant for playing at high volume, however, limited him to subsistence-level work until he was discovered in a small New York City club and brought to England in August 1966. Performing alongside two British musicians, bassist Noel Redding and drummer Mitch Mitchell, he stunned London’s clubland with his instrumental virtuosity and extroverted showmanship, numbering members of the Beatles, the Rolling Stones, and the Who among his admirers. It proved a lot easier for him to learn their tricks than it was for them to learn his.

Hendrix had an encyclopaedic knowledge of the musical roots on which the cutting-edge rock of his time was based, but, thanks to his years on the road with the likes of Little Richard and the Isley Brothers, he also had hands-on experience of the cultural and social worlds in which those roots had developed and a great admiration for the work of Bob Dylan, the Beatles, and the Yardbirds. Speedily adapting the current musical and sartorial fashions of late 1966 London to his own needs, he was soon able not only to match the likes of the Who at their own high-volume, guitar-smashing game but also to top them with what rapidly became the hottest-ticket show in town.

By November his band, the Jimi Hendrix Experience, had their first Top Ten single, “Hey Joe.” Two more hits, “Purple Haze” and “The Wind Cries Mary,” followed before their first album, Are You Experienced?, was released in the summer of 1967, when it was second in impact only to the Beatles’ Sgt. Pepper’s Lonely Hearts Club Band. Its immediate successor, Axis: Bold as Love, followed that December. On Paul McCartney’s recommendation, Hendrix was flown to California for a scene-stealing appearance at the Monterey Pop Festival, which rendered him a sensation in his homeland less than a year after his departure.
Relocating back to the United States in 1968, he enjoyed further acclaim with the sprawling, panoramic double album Electric Ladyland, but the second half of his career proved frustrating. Legal complications from an old contract predating his British sojourn froze his recording royalties, necessitating constant touring to pay his bills; and his audiences were reluctant to allow him to progress beyond the musical blueprint of his earliest successes. He was on the verge of solving both these problems when he died of an overdose of barbiturates, leaving behind a massive stockpile of works-in-progress that were eventually edited and completed by others.

For Hendrix, the thunderous drama of his hard rock band was but a fraction of what he aspired to: he wanted to compose more complex music for larger ensembles, rather than simply to improvise endlessly in front of a rhythm section for audiences waiting for him to smash or burn his guitar. Nevertheless, in his all-too-brief career, he managed to combine and extend the soaring improvisational transcendence of John Coltrane, the rhythmic virtuosity of James Brown, the bluesy intimacy of John Lee Hooker, the lyrical aesthetic of Bob Dylan, the bare-knuckle onstage aggression of the Who, and the hallucinatory studio fantasias of the Beatles. Hendrix’s work provides a continuing source of inspiration to successive generations of musicians to whom he remains a touchstone for emotional honesty, technological innovation, and an all-inclusive vision of cultural and social brotherhood.

Bruce Lee

Bruce Lee, Chinese name Li Jun Fan  (b. Nov. 27, 1940, San Francisco, Calif., U.S.—d. July 20, 1973, Hong Kong), American-born film actor who was renowned for his martial arts prowess and who helped popularize martial arts movies in the 1970s.

Lee was born in San Francisco, but he grew up in Hong Kong. He was introduced to the entertainment industry at an early age, as his father was an opera singer and part-time actor. The younger Lee began appearing in films as a child and was frequently cast as a juvenile delinquent or street urchin. As a teenager, he took up with local gangs and began learning kung fu in order to better defend himself. At this time he also started dance lessons, which further refined his footwork and balance; in 1958 Lee won the Hong Kong cha-cha championship.

Lee’s parents were increasingly disturbed by his street fighting and run-ins with the police, and they sent him to live in the United States shortly after he turned 18. He lived with family friends in Seattle, where he finished high school and studied philosophy and drama at the University of Washington. While in Seattle, he opened his first martial arts school, and in 1964 he relocated to Oakland, Calif., to found a second school. It was about this time that he developed his own technique, jeet kune do—a blend of ancient kung fu and philosophy—which he began teaching instead of traditional martial arts. He drew the attention of a television producer after giving a kung fu demonstration at a Los Angeles-area karate tournament, and he was cast as the sidekick Kato in the television series The Green Hornet (1966–67).

Lee had difficulty finding acting jobs after the cancellation of The Green Hornet, and he began supplementing his income by giving private jeet kune do lessons to Hollywood stars, including Steve McQueen. In the 1969 film Marlowe, Lee received notice for a scene in which he destroyed an entire office with kickboxing and karate moves. Troubled by his inability to find other suitable roles, however, he moved back to Hong Kong in 1971. There Lee starred in two films that broke box-office records throughout Asia and later found success in the United States: Tang shan da xiong (1971; Fists of Fury [U.S.], or The Big Boss [Hong Kong English title]) and Jing wu men (1972; The Chinese Connection [U.S.], or Fist of Fury [Hong Kong English title]).

Lee used his sudden box-office clout to form his own production company, and he coproduced, directed, wrote, and starred in his next film, Meng long guo jiang (1972; Return of the Dragon [U.S.], or The Way of the Dragon [Hong Kong English title]). Lee’s following film, Enter the Dragon (1973), was the first joint venture between Hong Kong- and U.S.-based production companies, and it became a worldwide hit, thrusting Lee into international movie stardom. Tragically, he died six days before the film’s Hong Kong release. The mysterious circumstances of his death were a source of speculation for fans and historians, but the cause of death was officially listed as swelling of the brain caused by an allergic reaction to a headache medication. At the time, Lee had been working on a film called Game of Death, which was pieced together with stand-ins and cardboard cutouts of Lee’s face and was released in 1978.

After Lee’s death, his films gained a large cult following. Lee himself became one of the biggest pop culture icons of the 20th century, and he is often credited with changing the way Asians were presented in American films. A slightly fictionalized biopic, Dragon: The Bruce Lee Story, appeared in 1993. His son, Brandon, followed Lee into acting, and he died after being shot with a misloaded prop gun while filming The Crow (1994).

Friday, November 26, 2010

Vodafone

Vodafone, telecommunications company based in the United Kingdom with interests in Europe and the United States. It originated as part of Racal, a British radar and electronics firm founded in 1950. Racal founded its Vodafone subsidiary in 1983 and won the license to build Britain’s first cellular telephone network, which was launched in 1985. By the early 1990s Vodafone was purchasing other companies and building network partnerships around the world.

The company roughly doubled its size in 2000 by acquiring German industrial conglomerate Mannesmann AG. Founded as Mannesmannroehren-Werke in 1890 by Reinhard Mannesmann (1856–1922), the German company had become a leading manufacturer of steel tubing and by the 1930s emerged as one of the six giant iron and steel works of the Ruhr. Although Mannesmann executives were not among the German industrialists who promoted the rise of Adolf Hitler, the company did contribute significantly to the war effort and therefore was stripped of nearly all its directors under the terms of the war-crimes mandate. Following reorganization of Germany’s basic industries according to Allied occupation policy, Mannesmann emerged as an independent company in 1952 and conducted business in Brazil, Canada, Argentina, Austria, and other European countries in industries such as transportation and telecommunications.

Vodafone’s hostile bid for Mannesmann resulted in what was then the largest merger in the world. In the process Vodafone had become one of the world’s leading providers of mobile telecommunications services, and it conducted business in more than 30 countries.

Tata Group

Tata Group chairman Ratan Tata next to the newly launched Tata Nano at the 9th Auto Expo in New …
[Credit: Saurabh Das/AP]

Tata Group, privately owned conglomerate of nearly 100 companies encompassing several primary business sectors: chemicals, consumer products, energy, engineering, information systems, materials, and services. Headquarters are in Mumbai.

The Tata Group was founded as a private trading firm in 1868 by entrepreneur and philanthropist Jamsetji Nusserwanji Tata. In 1902 the group incorporated the Indian Hotels Company to commission the Taj Mahal Palace & Tower, the first luxury hotel in India, which opened the following year. After Jamsetji’s death in 1904, his son Sir Dorab Tata took over as chair of the Tata Group. Under Dorab’s leadership the group quickly diversified, venturing into a vast array of new industries, including steel (1907), electricity (1910), education (1911), consumer goods (1917), and aviation (1932).

Following Dorab’s death in 1932, Sir Nowroji Saklatwala became the group’s chair. Six years later Jehangir Ratanji Dadabhoy Tata (J.R.D.) took over the position. His continued expansion of the company into new sectors—such as chemicals (1939), technology (1945), cosmetics (1952), marketing, engineering, and manufacturing (1954), tea (1962), and software services (1968)—earned Tata Group international recognition. In 1945 Tata Group established the Tata Engineering and Locomotive Company (TELCO) to manufacture engineering and locomotive products; it was renamed Tata Motors in 2003. In 1991 J.R.D.’s nephew, Indian business mogul Ratan Naval Tata, succeeded him as chairman of the Tata Group. Upon assuming leadership of the conglomerate, Ratan aggressively sought to expand it, and increasingly he focused on globalizing its businesses. In 2000 the group acquired London-based Tetley Tea, and in 2004 it purchased the truck-manufacturing operations of South Korea’s Daewoo Motors. In 2001 Tata Group partnered with American International Group, Inc. (AIG) to create the insurance company Tata-AIG.

In 2007 Tata Steel completed the biggest corporate takeover by an Indian company when it acquired the giant Anglo-Dutch steel manufacturer Corus Group. The following year the company made headlines worldwide when it ventured into the automotive industry. On Jan. 10, 2008, Tata Motors officially launched the Nano, a tiny, rear-engine, pod-shaped vehicle that eventually sold at a base price (excluding options, tax, and transportation fees) equivalent to $1,500 to $3,000. Although only slightly more than 3 metres (10 feet) long and about 1.5 metres (5 feet) wide, the highly touted “People’s Car” could seat up to five adults and, in Tata’s words, would provide a “safe, affordable, all-weather form of transport” for millions of middle- and lower-income consumers both in India and abroad. The first Nano hit the road in India in July 2009. Tata Motors purchased the elite British brands Jaguar and Land Rover from the Ford Motor Company in 2008.

Thursday, November 25, 2010

Latest Website Design Trends For 2010

Well, 2010 has finally arrived, and with it come new, innovative Website designs that will usher in the brave new look of tomorrow's online marketplace. There will be Web design trends that people will love and some they will hate, and what is hot at the beginning of the year may be cold by year's end. Additionally, the trends don't begin and end on January 1st; there is a definite, if subtle, shift in ideas as the year progresses and new concepts emerge and become more streamlined.

The trends that evolve help Web designers evolve as well, allowing them to master and refine their skills and to reach new heights of creativity and discovery. The team of Web designers at Active Web Group has identified the Website designs that we feel will be the hottest trends for 2010.

The Web design trends Active Web Group outlines below mark a different trend view from 2009, but not a drastic one. Contact Active Web Group to incorporate any of these new design trends into your custom-designed Website.

Oversized Logos/Headers
Oversized logos with an oversized header are one trend that is already growing in popularity, and will likely populate many newly designed or redesigned websites in 2010. The main objective behind a huge header is to increase brand recognition and leave a lasting impression. They are intended to take over a significant portion of the splash page, enticing visitors to scroll down.

Sketches/Hand-drawn Designs
Sketches or hand-drawn designs in 2010 will become not so much the main focus of a Web design as a method of personalizing standard web copy, and thus an elemental part of corporate design. A sketch helps a Website feel less cold and more like a personal interaction.

Slab Typefaces
While slab typefaces have been around for quite some time, they are only now gaining real prominence in Website design. Slab typefaces are usually set in all capital letters and are bold and imposing. They go hand in hand with large headers and can help express who you are as a company.

Typography
Typography was a big trend in 2009 and will quite possibly remain so in 2010. Websites utilizing Typography as their main design element may be more interesting to a reader than the same site with a large amount of images.

One Page Layouts
The development of one page layouts in 2010 will focus on personal profiles with a reduced corporate influence. Think online business cards. This Web design will focus on the individual, their blog, social media hangouts, etc.

Huge Images
A huge image is about creating a visual impact that the visitor will not forget, similar to the oversized logo/header. Such images are designed to draw the visitor further into the Website.

Change of Perspective
The change in perspective to a more realistic view will mark a definitive trend in 2010. Playing around with different perspectives, such as a side-shot aerial, may be a particular element that finds its way into the web design mix.

Interactive/Intuitive Design
The development of Websites with Flash has come a long way with the advent of SWFObject 2.2, sIFR, and other technologies that make Flash content more accessible to major search engines. 2010 will see Web designers move toward some of the more redeeming elements of Flash. Since the average visitor is now more Web savvy than in previous years, designers will begin to create sites that are more intuitive and interactive. A rough sketch of this progressive-embedding approach follows below.
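
Below is a rough, illustrative sketch of the kind of progressive embedding these libraries encourage: the page keeps ordinary, indexable HTML in the target element, and SWFObject swaps in the Flash movie only for visitors who can display it. The file names and element ID are placeholders, and the swfobject type declaration is only an assumption sketched for this example.

```typescript
// Rough sketch of SWFObject-style progressive embedding (not production code).
// The swfobject global normally comes from the SWFObject 2.2 script; this
// local declaration of its shape is an assumption made for the example.
declare const swfobject: {
  embedSWF: (
    swfUrl: string,
    targetElementId: string,
    width: string,
    height: string,
    minFlashVersion: string
  ) => void;
};

// "flash-intro" is a placeholder div that holds plain, search-indexable HTML.
// Visitors with Flash 9+ see intro.swf in its place; everyone else (including
// search engine crawlers) keeps the original markup.
swfobject.embedSWF("intro.swf", "flash-intro", "800", "400", "9.0.0");
```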


Modal Boxes
In 2010, modal boxes are picking up steam. They are similar to a popup but more engaging and less intrusive. Modal boxes are also easy to design and implement, which makes them a practical choice for designers; a simple sketch of one appears below.
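
As a rough sketch of the idea (independent of any particular library, with all element names, styles, and text invented for illustration), a modal box needs little more than a dimmed overlay and a dismissible panel:

```typescript
// Minimal modal box sketch: a full-screen overlay plus a centered content panel.
// All IDs, text, and styling here are illustrative, not taken from any framework.
function openModal(message: string): void {
  const overlay = document.createElement("div");
  overlay.id = "modal-overlay";
  overlay.style.cssText =
    "position:fixed;top:0;left:0;width:100%;height:100%;" +
    "background:rgba(0,0,0,0.6);display:flex;align-items:center;justify-content:center;";

  const panel = document.createElement("div");
  panel.style.cssText = "background:#fff;padding:2em;border-radius:4px;max-width:24em;";
  panel.textContent = message;

  const closeButton = document.createElement("button");
  closeButton.textContent = "Close";
  // Clicking the button (or the dimmed background) dismisses the modal,
  // so the visitor never leaves the page they were on.
  closeButton.addEventListener("click", () => overlay.remove());
  overlay.addEventListener("click", (e) => {
    if (e.target === overlay) overlay.remove();
  });

  panel.appendChild(closeButton);
  overlay.appendChild(panel);
  document.body.appendChild(overlay);
}

// Example: show a prompt without navigating away, unlike a traditional popup.
openModal("Sign up for our newsletter to get the latest design tips.");
```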

Minimalism
One of the trends Active Web Group sees coming in 2010 is Websites featuring loads of white space, bold typography, and distinctive color schemes. Minimalist designs will showcase fresh colors that bring warmth to Websites focused primarily on delivering information.

Oversized Footers
While we have seen only a small number of Websites with oversized footers thus far, we believe 2010 will see this feature become an integral part of overall Website design. Footers will highlight features such as feed updates from social media outlets, photo and video feeds, and more.

Retro
In an attempt to honor vintage art, 2010 will find Web designers turning to retro design. While a retro site might seem incomplete at first glance, the key to a successful retro design is a focus on an inspirational tone and playfulness.

Intro Boxes
The attraction of an intro box in 2010 is its simplicity: it introduces you to the visitor directly, without the struggle of developing a creative 'About Us' page.

Magazine Layouts
The magazine layout has emerged in 2010 partly because of the migration from the traditional press to online infotainment. A magazine layout gives the visitor all of the information they are seeking on one home page, letting them view everything quickly and at their convenience.

These are just a few of the Web designs Active Web Group believes will be the major trends in Website design for 2010.

The Next Microsoft Windows Operating System?

Microsoft released Windows 7 on Oct. 22, 2009, and rumors about the next Windows operating system (Windows 8 / Windows.next) have already started. I'm still confused, but a Microsoft employee claims that it's the real deal. You be the judge.

Nearly four months after the release of Windows 7 (and its more than 60 million copies sold), the internet is already buzzing with Windows 8 and now Windows.next rumors. Recently an anonymous blogger who allegedly works for Microsoft sort of spilled the beans in an MSDN blog titled "What's in store for the next Windows?" by describing the next Windows operating system as being "different from what folks usually expect of Windows," and went on to call the next Windows operating system "Windows.next" (while most bloggers and rumor sites were calling it Windows 8).

Usually we techies wouldn't go off random rumors from anonymous bloggers, but judging by the speed with which Microsoft had the blog post purged from the internet, the pictures displayed of the next Windows project timeline, and the similar descriptions given by Microsoft's Regional Vice President John Mangelaars about Windows 8 ("Windows 8 will be mind-blowing"), it's safe to say that this blogger might have been the real deal and the next Windows operating system will be called Windows.next.

Windows.next is rumored to be released in 2012 and is still being called Windows 8 by rumor sites and bloggers, but soon you might be seeing the name Windows.next popping up all over the place. Whatever Windows.next has to offer must be really mind-blowing and totally out of the realm of normal for someone to risk losing a cool job like working for Microsoft (and the legal consequences that come with leaking Microsoft's top-secret info).

Tuesday, November 23, 2010

Help Bing Donate $1 Million to Schools

At the time of this article, the total amount that Bing had managed to give to schools through the new Bing Million Dollar Giving Challenge was just $30,000.
That is a long way from the $1 million goal, and Microsoft is inviting users to help it donate money to schools in the United States.

The Redmond company has partnered with Donorschoose.org for the new Bing Million Dollar Giving Challenge, and is looking to attract as many people as possible to the initiative.

Essentially, the software giant is letting users donate its money, and of course, people can help spread Microsoft’s wealth where they see fit.

The Bing Gives website was set up to involve the audience, and to streamline donations. According to the Redmond company, users have two options at their disposal.

“You can opt to have Bing donate $1 on your behalf to DonorsChoose.org with just one click. Or, receive a $5 donation code that you can apply to the classroom project of your choice posted on the DonorsChoose.org website when you make Bing your homepage,” revealed a member of the Bing team.

With your help, Bing is looking to give $1 million to US schools by the end of this year. It’s Microsoft’s way of bringing a little holiday cheer to students in the US, while raising awareness of the problems faced by organizations in the education system, and also advertising its search/decision engine, Bing.

“The Bing One Million Dollar Giving Challenge is in partnership with DonorsChoose.org, a long-term partner of Bing that connects individuals with the schools and classrooms that they care about most,” a Bing team member added.

“Bing believes every child deserves a great education. In partnership with DonorsChoose.org, Bing has supported nearly half a million students and raised over $700,000 for deserving schools. This holiday season, our goal is to give even more – up to one million dollars with your help.”

Lync Server 2010 Developer Training Kit

As is the case with a range of Microsoft offerings, Lync Server 2010 is not only a new unified communication solution, but also a platform which can be leveraged by developers.
At the same time, the software giant is offering devs additional resources beyond the Lync SDK, in order to simplify development for the new platform. Case in point: the UC "14" Developer Training Kit.

The training kit is in fact focused on the SDKs for Lync Server 2010 and Exchange Server 2010, offering, as the official label of the download states, training resources designed to allow devs to expand on and build custom unified communications experiences.

“Microsoft Lync Server 2010 and Microsoft Exchange Server 2010 provide the communication platform for developers to rapidly build solutions that integrate and extend communications into their business processes based on familiar tools and skills,” revealed Bruce D. Kyle, ISV Architect Evangelist at Microsoft.

“This Unified Communications “14” Training Kit provides deep technical training on all aspects of the Lync Server 2010 and Exchange Server 2010 SDKs to give developers the skills they need to be productive developing communications driven business processes.”

There are no fewer than nine modules offered via the training kit for Lync 2010. In this regard, developers will be able to take advantage of not only a presentation but also a hands-on lab put together by Microsoft.

The Training Kit will enable devs using the SDK to create custom communication experiences on top of the Lync 2010 platform, including by embedding presence and communications into applications.

Kyle added that “this training kit includes the following modules:

1. Getting Started with Microsoft Lync Server 2010 and Exchange Server 2010 Development

2. Integrating Microsoft Lync 2010 Features with the Lync Controls

3. Building Contextual Conversations with the Microsoft Lync 2010 Managed API

4. Building Communications Clients with the Lync 2010 Managed API

5. Web Services Managed API 1.0

6. Getting Started with Unified Communications Managed API 3.0

7. Building Communications Workflows with UCMA 3.0 Workflow SDK

8. Building Advanced Communications Solutions with UCMA 3.0

9. Lync Server 2010 and Exchange Server 2010: Architecture and Deployment.”

Saturday, November 20, 2010

International Business Machines Corporation (IBM)

Garry Kasparov playing against Deep Blue, the chess-playing computer built by IBM. [Credit: Adam Nadel/AP]

International Business Machines Corporation (IBM), leading American computer manufacturer, with a major share of the market both in the United States and abroad. Its headquarters are in Armonk, N.Y.

It was incorporated in 1911 as the Computing-Tabulating-Recording Company in a consolidation of three smaller companies that made punch-card tabulators and other office products. The company assumed its present name in 1924 under the leadership of Thomas Watson, a man of considerable marketing skill who became general manager in 1914 and had gained complete control of the firm by 1924. Watson built the then-floundering company into the leading American manufacturer of punch-card tabulating systems used by governments and private businesses. He also developed a highly disciplined and competitive sales force that adapted the company’s custom-built tabulating systems to the needs of particular customers.

An IBM 650 computer system, c. 1954. [Credit: IBM Archives]

In 1933 IBM purchased Electromatic Typewriters, Inc., and thereby entered the field of electric typewriters, in which it eventually became an industry leader. During World War II, IBM helped construct several high-speed electromechanical calculators that were the precursors of electronic computers. But the firm refrained from producing these electronic data-processing systems until Watson’s son, Thomas Watson, Jr., became president of the company in 1952 and sponsored an all-out push into that field. Once it entered the computer field, IBM’s size allowed it to invest heavily in development. This investment capability, added to its dominance in office-calculating machines, its marketing expertise, and its commitment to repair and service its own equipment, allowed IBM to quickly assume the predominant position in the American computer market. By the 1960s it was producing 70 percent of the world’s computers and 80 percent of those used in the United States.

The IBM Personal Computer (PC) was introduced in 1981. [Credit: IBM Archives]

IBM’s specialty was mainframe computers—i.e., expensive medium- to large-scale computers that could process numerical data at great speeds. The company did not enter the growing market for personal computers until 1981, when it introduced the IBM Personal Computer. This product achieved a major share of the market, but IBM was nevertheless unable to exercise its accustomed dominance as a maker of personal computers. New semiconductor-chip-based technologies were making computers smaller and easier to manufacture, allowing smaller companies to enter the field and exploit new developments such as workstations, computer networks, and computer graphics. IBM’s enormous size hindered it from responding rapidly to these accelerating rates of technological change, and by the 1990s the company had downsized considerably. In 1995 IBM purchased Lotus Development Corporation, a major software manufacturer.

In 2002 IBM sold its magnetic hard drive business for $2.05 billion to the Japanese electronics firm of Hitachi, Ltd. Under the terms of the sale, IBM agreed to continue producing hard drives with Hitachi for three years in a joint venture known as Hitachi Global Storage Technologies. In 2005 Hitachi took full control of the joint venture and IBM stopped building a device that it had invented in 1956. In December 2005 IBM sold its personal computer division to the Lenovo Group, a major Chinese manufacturer. In addition to cash, securities, and debt restructuring, IBM acquired an 18.9 percent stake in Lenovo, which acquired the right to market its personal computers under the IBM label through 2010. With these divestitures, IBM shifted away from manufacturing so-called commodity products in order to concentrate on its computer services, software, supercomputer, and scientific research divisions.

Since 2000, IBM has consistently placed one of its supercomputers at or near the top of the industry’s list of most powerful machines as measured by standardized computation tests. In addition to producing supercomputers for governments and large corporations, IBM’s supercomputer division, in cooperation with the Toshiba Corporation and the Sony Corporation of Japan, designed the Cell Broadband Engine.

Developed over a four-year period beginning in 2001, this advanced computer chip has multiple applications, from supercomputers to Toshiba high-definition televisions to the Sony PlayStation 3 electronic game system. IBM also designed the computer chips for the Microsoft Corporation Xbox 360 and the Nintendo Company Wii game systems. IBM became the first company to generate more than 3,000 patents in one year (2001) and, later, more than 4,000 patents in one year (2008). The company now holds more than 40,000 active patents, which generate considerable income from royalties.

Wikipedia

Wikipedia, free, Internet-based encyclopaedia operating under an open-source management style. It is overseen by the nonprofit Wikimedia Foundation. Wikipedia uses collaborative software known as a wiki that facilitates the creation and development of articles. The English-language version of Wikipedia began in 2001. It had more than one million articles by March 2006 and more than two million by September 2007, and it continues to grow at a rate of millions of words per month. Much of its content treats popular culture topics not covered by traditional encyclopaedias. Wikipedia is also an international project with versions in scores of languages, including French, German, Polish, Dutch, Hebrew, Chinese, and Esperanto. Although some highly publicized problems have called attention to Wikipedia’s editorial process, they have done little to dampen public use of the resource.

In 1996 Jimmy Wales, a successful bond trader, moved to San Diego, Calif., to establish Bomis, Inc., a Web portal company. In March 2000 Wales founded Nupedia, a free online encyclopaedia, with Larry Sanger as editor-in-chief. Nupedia was organized like existing encyclopaedias, with an advisory board of experts and a lengthy review process. By January 2001, fewer than two dozen articles were finished, and Sanger advocated supplementing Nupedia with an open-source encyclopaedia. On Jan. 15, 2001, Wikipedia was launched as a feature of Nupedia.com, but, following objections from the advisory board, it was relaunched as an independent Web site a few days later. In its first year, Wikipedia expanded to some 20,000 articles in 18 languages. In 2003 Nupedia was terminated and its articles moved into Wikipedia.

In some respects, Wikipedia’s open-source production model is the epitome of the so-called Web 2.0, an egalitarian environment where the web of social software enmeshes users in both their real and virtual-reality workplaces. The Wikipedia community is based on a limited number of standard principles. One important principle is neutrality; another is the faith that contributors are participating in a sincere and deliberate fashion. Readers can correct what they perceive to be errors, and disputes over facts and possible bias are conducted through contributor discussions, with Wales as the final arbiter. Three other “pillars of wisdom” are: not to use copyrighted material, not to contribute original research, and not to have any other rules. The last pillar reinforces the project’s belief that the open-source process will make Wikipedia into the best product available, given its community of users.

The central policy of inviting readers to serve as authors or editors creates the potential for problems as well as their partial solution. Not all users are scrupulous about providing accurate information, and Wikipedia must also deal with individuals who deliberately deface particular articles, post misleading or false statements, or add obscene material. Wikipedia’s method is to rely on its users to monitor and clean up its articles. Trusted contributors can also receive administrator privileges that provide access to an array of software tools to fix Web graffiti and other serious problems speedily.

Reliance on community self-policing has generated some problems. In 2005 the American journalist John Seigenthaler, Sr., discovered that his Wikipedia biography falsely identified him as a potential conspirator in the assassinations of both John F. Kennedy and Robert F. Kennedy and that these malicious claims had survived Wikipedia’s community policing for 132 days. The author of this information could not be easily identified, since all that is known about contributors is their computers’ IP, or Internet protocol, addresses (many of which are dynamically generated each time a user goes online). (The contributor later confessed and apologized, saying that he wrote the false information as a joke.) Wikipedia administrators have the power to block particular IP addresses—a power they used in 2006 after it was found that staff members of some U.S. congressional representatives had altered articles to eliminate unfavourable information.

News of such self-interested editing inspired Virgil Griffith, a graduate student at the California Institute of Technology, to create Wikipedia Scanner, or WikiScanner, in 2007. By first downloading the entire Wikipedia corpus, Griffith was able to view all of the edits made to its articles and the IP addresses where the edits originated. He then correlated these addresses with their owners (individuals, corporations, and government entities) to create a database that he made available on the Web for anyone to search through. He and other researchers quickly discovered that editing from computers located in corporations and government offices was widespread. Although most of the edits were innocuous—typically, individuals working on subjects unrelated to their positions—a pattern did seem to emerge of many articles being edited to reflect more favourably on the editors’ hosts. Articles on political subjects in particular have become the greatest test of Wikipedia’s principle of neutrality. To help on this front, Griffith released another tool, WikiGanda, in 2008. The new database documents some of the site’s “edit wars,” or propaganda battles, such as the intermittent efforts of neo-Nazis to rewrite Wikipedia’s articles on the Holocaust.
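
As a simplified illustration of the kind of correlation WikiScanner performed (the data structures, addresses, and ownership table below are invented for the example, not Griffith's actual code or data), matching an anonymous edit's IP address against known organizational address ranges might look like this:

```typescript
// Illustrative sketch only: map anonymous edits to organizations by IP range.
// The ranges and edits below are invented examples, not real WikiScanner data.
interface IpRange { owner: string; start: number; end: number; }
interface AnonEdit { article: string; ip: string; }

// Convert a dotted-quad IPv4 address to a comparable 32-bit number.
function ipToNumber(ip: string): number {
  return ip.split(".").reduce((acc, octet) => acc * 256 + parseInt(octet, 10), 0);
}

// Find which registered range, if any, contains the edit's address.
function findOwner(ip: string, ranges: IpRange[]): string | undefined {
  const n = ipToNumber(ip);
  return ranges.find((r) => n >= r.start && n <= r.end)?.owner;
}

// Hypothetical ownership table (in practice this information comes from public IP registries).
const ranges: IpRange[] = [
  { owner: "Example Corp", start: ipToNumber("198.51.100.0"), end: ipToNumber("198.51.100.255") },
];

const edits: AnonEdit[] = [
  { article: "Example article", ip: "198.51.100.42" },
];

// Attribute each anonymous edit to the organization that owns the originating address.
for (const edit of edits) {
  const owner = findOwner(edit.ip, ranges) ?? "unknown";
  console.log(`${edit.article} edited from ${edit.ip} (${owner})`);
}
```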

For many observers of these controversies, a troubling difference between Wikipedia and other encyclopaedias lies in the absence of editors and authors who will accept responsibility for the accuracy and quality of their articles. These observers point out that identifiable individuals are far easier to hold accountable for mistakes, bias, and bad writing than is a community of anonymous volunteers, but other observers respond that it is not entirely clear if there is a substantial difference. Regardless of such controversies—perhaps in part because of them—Wikipedia has become a model of what the collaborative Internet community can and cannot do.

Friday, November 19, 2010

Intel Corporation

A detail of the Intel Desktop Board D915GUX. The primary circuit board connects all the basic … [Credit: Copyright Intel Corporation]

Intel Corporation, American manufacturer of semiconductor computer circuits. It is headquartered in Santa Clara, Calif. The company’s name comes from “integrated electronics.”

Intel was founded in July 1968 by American engineers Robert Noyce and Gordon Moore. Unlike the archetypal Silicon Valley start-up business with its fabled origins in a youthful founder’s garage, Intel opened its doors with $2.5 million in funding arranged by Arthur Rock, the American financier who coined the term venture capitalist. Intel’s founders were experienced, middle-aged technologists who had established reputations. Noyce was the coinventor in 1959 of the silicon integrated circuit when he was general manager of Fairchild Semiconductor, a division of Fairchild Camera and Instrument. Moore was the head of research and development at Fairchild Semiconductor. Immediately after founding Intel, Noyce and Moore recruited other Fairchild employees, including Hungarian-born American businessman Andrew Grove. Noyce, Moore, and Grove served as chairman and chief executive officer (CEO) in succession during the first three decades of the company’s history.

Intel’s initial products were memory chips, including the 1101, the world’s first metal oxide semiconductor (MOS) memory chip, which did not sell well. However, its sibling, the 1103, a one-kilobit dynamic random-access memory (DRAM) chip, was successful and the first chip to store a significant amount of information. It was purchased first by the American technology company Honeywell Incorporated in 1970 to replace the core memory technology in its computers. Because DRAMs were cheaper and used less power than core memory, they quickly became the standard memory devices in computers worldwide.

Following its DRAM success, Intel became a public company in 1971. That same year Intel introduced the erasable programmable read-only memory (EPROM) chip, which was the company’s most successful product line until 1985. Also in 1971 Intel engineers Ted Hoff, Federico Faggin, and Stan Mazor invented the 4004, a general-purpose four-bit chip that was the first single-chip microprocessor, under contract to the Japanese calculator manufacturer Nippon Calculating Machine Corporation, which let Intel retain all rights to the technology.

Not all of Intel’s early endeavours were successful. In 1972 management decided to enter the growing digital watch market by purchasing Microma. But Intel had no real understanding of consumers and sold the watchmaking company in 1978 at a loss of $15 million. In 1974 Intel controlled 82.9 percent of the DRAM chip market, but, with the rise of foreign semiconductor companies, the company’s market share dipped to 1.3 percent by 1984. By that time, however, Intel had shifted its focus from memory chips to its microprocessor business: in 1972 it produced the 8008, an eight-bit central processing unit (CPU); the 8080, which was 10 times faster than the 8008, came two years later; and in 1978 the company built its first 16-bit microprocessor, the 8086.

In 1981 the American computer manufacturer International Business Machines (IBM) chose Intel’s 16-bit 8088 to be the CPU in its first mass-produced personal computer (PC). Intel also provided its microprocessors to other manufacturers that made PC “clones” that were compatible with IBM’s product. The IBM PC and its clones ignited the demand for desktop and portable computers. IBM had contracted with a small firm in Redmond, Wash., Microsoft Corporation, to provide the disk operating system (DOS) for its PC. Eventually Microsoft supplied its Windows operating system to IBM PCs, which, with a combination of Windows software and Intel chips, were dubbed “Wintel” machines and have dominated the market since their inception.

Of the many microprocessors Intel has produced, perhaps the most important was the 80386, a 32-bit chip released in 1985 that started the company’s commitment to make all future microprocessors backward-compatible with previous CPUs. Application developers and PC owners could then be assured that software that worked on older Intel machines would run on the newest models.

With the introduction of the Pentium microprocessor in 1993, Intel left behind its number-oriented product naming conventions for trademarked names for its microprocessors. The Pentium was the first Intel chip for PCs to use parallel, or superscalar, processing, which significantly increased its speed. It had 3.1 million transistors, compared with the 1.2 million transistors of its predecessor, the 80486. Combined with Microsoft’s Windows 3.x operating system, the much faster Pentium chip helped spur significant expansion of the PC market. Although businesses still bought most PCs, the higher-performance Pentium machines made it possible for consumers to use PCs for multimedia, graphical applications such as games like Doom and Wing Commander that required more processing power.

Moore’s law [Credit: Encyclopædia Britannica, Inc.]

Intel’s business strategy relied on making newer microprocessors dramatically faster than previous ones to entice buyers to upgrade their PCs. One way to accomplish this was to manufacture chips with vastly more transistors in each device. For example, the 8088 found in the first IBM PC had 29,000 transistors, while the 80386 unveiled four years later included 275,000, and the Core 2 Quad introduced in 2008 had more than 800,000,000 transistors. The Tukwila, scheduled for production in 2010, is designed to have 2,000,000,000 transistors. This growth in transistor count became known as Moore’s law, named after company cofounder Gordon Moore, who observed in 1965 that the transistor count on a silicon chip would double approximately annually; he revised it in 1975 to a doubling every two years.
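
To see how the transistor counts cited above track that doubling rate, here is a back-of-the-envelope projection; the 1981 baseline of 29,000 transistors comes from the 8088 figures in this article, and the helper function itself is purely illustrative:

```typescript
// Rough Moore's law projection: transistor count doubling every two years.
// Baseline: the 8088 in the first IBM PC (1981) with roughly 29,000 transistors.
const baselineYear = 1981;
const baselineTransistors = 29_000;
const doublingPeriodYears = 2;

function projectedTransistors(year: number): number {
  const doublings = (year - baselineYear) / doublingPeriodYears;
  return baselineTransistors * Math.pow(2, doublings);
}

// Compare the projection with the counts mentioned in the article.
for (const year of [1985, 2008]) {
  console.log(`${year}: ~${Math.round(projectedTransistors(year)).toLocaleString()} projected`);
}
// 1985 projects to ~116,000 (the 80386 actually had 275,000), and 2008 projects
// to about 336 million (the Core 2 Quad exceeded 800 million), so the two-year
// doubling rule captures the order of magnitude rather than exact counts.
```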

In order to increase consumer brand awareness, in 1991 Intel began subsidizing computer advertisements on the condition that the ads included the company’s “Intel inside” label. Under the cooperative program, Intel set aside a portion of the money that each computer manufacturer spent annually on Intel chips, from which Intel contributed half the cost of that company’s print and television ads during the year. Although the program directly cost Intel hundreds of millions of dollars each year, it had the desired effect of establishing Intel as a conspicuous brand name.

An Intel® Pentium® 4 processor (detail of die photo) contains more than 40 million … [Credit: © Intel Corporation]

Intel’s famed technical prowess was not without mishaps. Its greatest mistake was the so-called “Pentium flaw,” in which an obscure segment among the Pentium CPU’s 3.1 million transistors performed division incorrectly. Company engineers discovered the problem after the product’s release in 1993 but decided to keep quiet and fix the problem in updates to the chip. However, mathematician Thomas Nicely of Lynchburg College in Virginia also discovered the flaw. At first Grove (then CEO) resisted requests to recall the product. But when IBM announced it would not ship computers with the CPU, it forced a recall that cost Intel $475 million.

Although bruised by the Pentium fiasco, the combination of Intel technology with Microsoft software continued to crush the competition. Rival products from the semiconductor company Advanced Micro Devices (AMD), the wireless communications company Motorola, the computer workstation manufacturer Sun Microsystems, and others rarely threatened Intel’s market share. As a result, the Wintel duo consistently faced accusations of being monopolies. In 1999 Microsoft was found guilty in a U.S. district court of being a monopolist after being sued by the Department of Justice, while in 2009 the European Union fined Intel $1.45 billion for alleged monopolistic actions. In 2009, Intel also paid AMD $1.25 billion to settle a decades-long legal dispute in which AMD accused Intel of pressuring PC makers not to use the former’s chips.

By the mid-1990s Intel had expanded beyond the chip business. Large PC makers, such as IBM and Hewlett-Packard, were able to design and manufacture Intel-based computers for their markets. However, Intel wanted other, smaller PC makers to get their products and, therefore, Intel’s chips to market faster, so it began to design and build “motherboards” that contained all the essential parts of the computer, including graphics and networking chips. By 1995 the company was selling more than 10 million motherboards to PC makers, about 40 percent of the overall PC market. In the early 21st century the Taiwan-based manufacturer ASUSTeK had surpassed Intel as the leading maker of PC motherboards.

By the end of the century, Intel and compatible chips from companies like AMD were found in every PC except Apple Inc.’s Macintosh, which had used CPUs from Motorola since 1984. Craig Barrett, who succeeded Grove as Intel CEO in 1998, was able to close that gap. In 2005 Apple CEO Steven Jobs shocked the industry when he announced that future Apple PCs would use Intel CPUs. Thus, with the exception of some high-performance computers, such as servers and mainframes, Intel and Intel-compatible microprocessors can be found in virtually every PC.

Paul Otellini succeeded Barrett as Intel’s CEO in 2005. Jane Shaw replaced Barrett as chairman in 2009, when the company was ranked 61st on the Fortune 500 list of the largest American companies.

Hewlett-Packard Company

Hewlett-Packard Company, American manufacturer of computers, computer peripherals, and instrumentation equipment. Headquarters are in Palo Alto, California.
The garage in Palo Alto, California, where William Hewlett and David Packard began building …
[Credit: Reproduced with permission of the Hewlett-Packard Company Archives]

Founding and early growth

The company was founded on January 1, 1939, by William R. Hewlett and David Packard, two recent electrical-engineering graduates of Stanford University. It was the first of many technology companies to benefit from the ideas and support of engineering professor Frederick Terman, who pioneered the strong relationship between Stanford and what eventually emerged as Silicon Valley. The company established its reputation as a maker of sophisticated instrumentation. Its first customer was Walt Disney Productions, which purchased eight audio oscillators to use in the making of its full-length animated film Fantasia (1940). During World War II the company developed products for military applications that were important enough to earn Packard a draft exemption, while Hewlett served in the Army Signal Corps. Throughout the war the company worked with the Naval Research Laboratory to build counter-radar technology and advanced artillery shell fuses.

Postwar growth

After the war, Packard became responsible for the company’s business, while Hewlett led its research and development efforts. Following a postwar slump in defense contracts, in 1947 Hewlett-Packard returned to the revenue levels of the war years and grew continuously thereafter through a strategy of product diversification. One of its most popular early products was a high-speed frequency counter that it introduced in 1951. It was used in the rapidly growing market of FM radio and television broadcast stations for precisely setting signal frequencies according to Federal Communications Commission regulations. Military sales during the Korean War also boosted company revenues.
To help fund the development of new products, Hewlett-Packard raised money by issuing public stock in 1957. In addition, it embarked on a long campaign of expanding its product line by acquiring companies, beginning the year after it went public with the purchase of F.L. Moseley Company, a maker of graphic recorders. In 1961 it began its climb to status as a medical-instrument manufacturer with the purchase of Sanborn Company.
Hewlett-Packard’s HP-35 calculator, 1972. [Credit: Reproduced with permission of the Hewlett-Packard Company Archives]

In 1964 Hewlett-Packard instrumentation gained international recognition in a technological publicity stunt. Company engineers flew around the world with its cesium beam HP 5060A instrument to synchronize the globe’s atomic clocks to within one-millionth of a second. Four years later the company introduced the first desktop calculator. In 1972, using advanced integrated-circuit technology, Hewlett-Packard unveiled the first pocket-sized calculator. Selling at one-sixth the price of the original desktop unit, the pocket calculator eventually forced the obsolescence of the venerable slide rule.
Although the company never developed weapons systems, it depended heavily throughout its history on military spending, because its instrumentation has been used to develop and test military products, particularly as weapons systems have become more dependent on electronic and semiconductor technologies. The military expertise of Hewlett-Packard was underscored in 1969 when U.S. Pres. Richard M. Nixon appointed Packard deputy secretary of defense, in which position he oversaw the initial plans for the development of two of the country’s most successful jet fighter programs, the F-16 and the A-10.

Computer business

Hewlett-Packard’s first computer, the HP 2116A, was developed in 1966 specifically to manage the company’s test and measurement devices. In 1972 the company released the HP 3000 general-purpose minicomputer—a product line that remains in use today—for use in business. In 1976 an engineering intern at the company, Stephen G. Wozniak, built a prototype for the first personal computer (PC) and offered it to the company. Hewlett-Packard declined and gave Wozniak all rights to his idea; later he joined with Steven P. Jobs to create Apple Computer, Inc. (now Apple Inc.).

Hewlett-Packard introduced its first desktop computer, the HP-85, in 1980. Because it was incompatible with the IBM PC, which became the industry standard, it was a failure. The company’s next major foray into the PC market was with the HP-150, an IBM PC-compatible system that had a touch screen. Although technically interesting, it also failed in the marketplace. The company’s first successful product for the PC market was actually a printer. The HP LaserJet appeared in 1984 to rave reviews and huge sales, becoming Hewlett-Packard’s single most successful product.

In the mid-1980s Hewlett-Packard found itself losing business in its core fields of science and engineering to rival computer workstation companies such as Sun Microsystems, Inc., Silicon Graphics, Inc., and Apollo Computer. In 1989 Hewlett-Packard bought Apollo to become the number one workstation maker, a position it has since shared on and off with Sun.

As the 1990s began, the company missed some revenue and profit targets, causing a steep decline in its stock price. As a result, Packard came out of retirement to take an active role in the management of the company. The most dramatic changes came in its lacklustre PC group with the introduction of new computers, colour printers, and peripherals at low prices that made the company one of the world’s top three PC manufacturers. In 1993, with the company turnaround complete, Packard retired again. In 1997 Hewlett-Packard became one of the 30 companies whose stock price makes up the Dow Jones Industrial Average of the New York Stock Exchange. In 1999 the company spun off its measurement, electronic components, and medical businesses as Agilent Technologies, though it retained a majority of the new company’s common stock until 2000. Also during the 1990s, Hewlett-Packard collaborated with the Intel Corporation, an integrated circuit manufacturer, in the design of the 64-bit Itanium microprocessor, which was introduced in 2001.
Hewlett-Packard acquired the Compaq Computer Corporation, a major American PC manufacturer, in 2002. The move, made at the urging of the recently hired chief executive officer, Carly Fiorina, the first woman to lead a company listed in the Dow Jones, was bitterly opposed by some members of the company’s board of directors and certain major stockholders, including Walter Hewlett, son of the company’s cofounder. When the supposed benefits of the merger failed to materialize, she was forced out in 2005. Nevertheless, the company soon turned its balance sheet around, and in 2007 Hewlett-Packard became the first technology company to exceed $100 billion in sales revenue for a fiscal year (after first passing IBM in revenue the year before).
Hewlett-Packard continued its acquisition strategy in 2010 with a decision to acquire Palm, Inc., an American manufacturer of personal digital assistants (PDAs) and smartphones. Palm’s position in the highly competitive smartphone market was weak, but its multitasking operating system, known as webOS (a “next generation” successor to the original Palm OS), was considered by analysts to be a leading system for smartphones. The acquisition would complement Hewlett-Packard’s two lines of iPAQ smartphones, one for business users and one for consumers, that ran Microsoft Corporation’s Windows Mobile OS.

During the 2000s, Hewlett-Packard expanded its worldwide operations by opening research laboratories in Bangalore, India (2002), Beijing, China (2005), and St. Petersburg, Russia (2007); these joined a list that included laboratories in Bristol, England (1984), Tokyo, Japan (1990), and Haifa, Israel (1994).

Management approach

Early in the company’s history, the two founders endorsed formal management procedures, and Hewlett-Packard was one of the first corporations to use the “management by objective” approach. They also created an informal workplace, encouraging the use of first names among employees, even for themselves. Packard and Hewlett were also known for “management by walking around,” visiting as many departments as possible without appointments or scheduled meetings and talking with line workers as often as with managers in order to understand how the company was operating. Hewlett-Packard became one of the first businesses in the United States to endorse the idea that employees, customers, and the community have as valid an interest in company performance as do shareholders. As a result, it consistently ranked among the best places to work for women and minorities. It also became one of the leading contributors to charitable organizations, donating as much as 4.4 percent of its pretax profits.

Wednesday, November 17, 2010

Sony Corporation

Sony Corporation, Japanese Sony KK, major Japanese manufacturer of consumer electronics products.

Rice cookers to transistor radios

The company was incorporated by Ibuka Masaru and Morita Akio in 1946 as Tokyo Tsushin Kogyo (“Tokyo Telecommunications Engineering Corporation”). Ibuka, whose Japan Precision Instruments Company had supplied electronic devices during World War II, and Morita, an applied sciences instructor, had met during the war as engineers designing heat-seeking missiles for the Imperial Japanese Army. Ibuka and Morita worked together for the next 40 years in what has been called one of “business history’s most productive and intriguing relationships.” Ibuka’s genius with product development and Morita’s mastery of business management and marketing turned Sony into one of the most renowned brand names on the globe. Sony, which became the official name for the company in January 1958, was derived from the Latin sonus (“sound”) and was conceived to be an international and not a Japanese term.
The company’s first consumer product was an electric rice cooker. Although this product sold poorly, Totsuko, as the firm’s name was abbreviated, did have a successful business repairing radios and other electrical equipment. Its repair work for the Japanese radio broadcaster NHK had to be approved by the U.S. Army of Occupation, which later gave the young company repair jobs of its own.
In 1950 Totsuko introduced the first Japanese-designed tape recorder. Although this consumer item also sold poorly, the company’s fortunes were about to take a dramatic turn. In 1952 Ibuka visited the United States and made the initial contacts for licensing the transistor from Bell Laboratories, then a division of Western Electric Company, the manufacturing arm of American Telephone & Telegraph (AT&T). The next year Morita went to the United States and signed the deal with Western Electric.
This watershed agreement led to Totsuko’s first hugely successful product line: transistor radios. Although Texas Instruments Incorporated was first to market with its Regency transistor radio in 1955, it was Sony’s TR-63, an inexpensive shirt-pocket-sized all-transistor radio, that caught consumers’ attention when it was released in 1957. Sony’s pocket radios were a tremendous success and brought international recognition of the company’s brand name.

Electronics giant

By 1960 business in the United States prompted the creation of Sony Corporation of America, with headquarters in New York City. When the company opened its store in 1962 on Fifth Avenue, it unfurled the first Japanese flag to be flown in the United States since the beginning of World War II.
At the 1964 New York World’s Fair, Sony introduced the MD-5, the first all-transistor desktop calculator. In 1968 the company shipped its first Trinitron colour television. By 1971, 40 percent of Japanese households had colour television sets, so Sony introduced the first colour video cassette recorder (VCR), which led to its introduction of the Betamax VCR in 1975. The Betamax, though widely considered the best VCR technology ever developed, was more expensive than its competitor, the VHS (Video Home System). As more and more studios and video stores turned to VHS, Betamax lost market share, and Sony finally introduced its own VHS in 1988.
In 1979 the Sony Walkman portable tape player hit the streets. Although Sony’s engineers were skeptical about designing a device that could only play and not record, Morita insisted on developing the product, saying he would resign if the Walkman was not a success. The Walkman was an international sensation and eventually sold hundreds of millions of units. The first compact disc (CD) player emerged in 1982 from a development agreement between Sony and the Dutch manufacturer Philips Electronics NV. Sony provided pulse-code modulation technology and combined it with Philips’s laser system. The failure of Betamax had taught Sony a lesson; the format standard for CDs (and later digital videodiscs [DVDs]) was agreed upon by a wide range of companies in Japan, Europe, and North America. The next year Sony introduced the first camcorder.

Diversification and downturn

By the late 1980s, Sony executives, especially the company president and the chairman of Sony Corporation of America, Norio Ohga, wanted to add entertainment content to Sony’s operations. In 1988 it bought CBS Records Group from CBS Inc. (now CBS Corporation), thus acquiring the world’s largest record company, and the next year it purchased Columbia Pictures Entertainment, Inc. The Columbia acquisition, the largest to that time of an American company by a Japanese firm, ignited a controversy in the United States. The controversy was fanned by Morita’s contribution to The Japan That Can Say No, an essay written with Japanese nationalist Shintaro Ishihara in 1989. They claimed that Japan no longer depended on the United States and was a stronger, better nation than its postwar ally.
AIBO entertainment robot, model ERS-111. [Credit: Courtesy of Sony Electronics Inc.]

The early 1990s were difficult years for Sony. The Japanese economy entered a decade-long recession, and both Ibuka and Morita suffered strokes (in 1992 and 1993, respectively). Morita officially retired in 1994 and died in 1999. With its founders no longer at the controls, Sony declared its first loss, more than $200 million, in 1993. Despite the business turmoil, Sony continued to design and deliver new products. In 1994 its entertainment division introduced its PlayStation video game console to the Japanese market. By 2002 the game unit was contributing more than 10 percent of the company’s yearly revenues. Another major profit centre was Sony Online Entertainment, particularly its Internet virtual reality game EverQuest. The company’s entertainment group also captured the imagination of many people with its robot dog, AIBO, introduced in 1999. In 1997 Sony introduced the VAIO line of personal computers. The VAIO was a high-quality and expensive system that the company marketed to users interested in developing or playing multimedia programs.
In 2005, following further disappointing annual financial reports, Howard Stringer was elevated from chairman and chief executive officer of the Sony Corporation of America to chairman and chief executive officer of the Sony Corporation. Although the appointment of a non-Japanese to head the parent company surprised many, some two-thirds of Sony’s employees worldwide are non-Japanese. In 2009 Stringer also became president of Sony’s electronics division.