Tuesday, March 6, 2012

Chapter 13 : Copyright and Fair Use

Fair use: how much is too much?

Page view "journalism" and content aggregation are the cornerstones of the explosive growth of Gawker, The Huffington Post and a slew of other blogs, but does that growth come at the expense of the publications that conducted the original reporting? Every editor and publisher should be well versed in fair use standards in order to take advantage of aggregation opportunities themselves while also protecting their copyrighted material from story harvesters.

Fair use basics

Fair use criteria

  1. the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
  2. the nature of the copyrighted work;
  3. the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
  4. the effect of the use upon the potential market for or value of the copyrighted work.
The problem with fair use is that the criteria for determining it (listed above) are vague. Stanford University has a great resource for measuring whether or not a use of copyrighted material fits the fair use guidelines, and I'll refer you to that resource rather than rehash the details here. The Digital Millennium Copyright Act allows copyright owners to ask an Internet Service Provider to take down content that violates their copyright, but U.S. District Judge Jeremy Fogel of San Jose, California ruled in 2008 that copyright holders cannot order a deletion of an online file without determining whether that posting reflected "fair use" of the copyrighted material. I've provided some examples of aggregation at the bottom of this post. You might be surprised by who's violating copyright and by how much.
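The four factors above are weighed together as a holistic legal judgment, not applied mechanically. Purely as a teaching device (this is not legal advice, and the "three or more factors" threshold below is an invented assumption, not anything a court uses), the balancing act can be pictured as a simple checklist:

```python
# Illustrative sketch only: fair use is a holistic legal judgment,
# not a computable score. The factor names follow the statutory list;
# the threshold used here is a made-up teaching device.

FACTORS = [
    "purpose and character of the use (transformative? nonprofit?)",
    "nature of the copyrighted work (factual vs. creative)",
    "amount and substantiality of the portion used",
    "effect on the potential market for the original",
]

def weigh_fair_use(answers):
    """answers: list of 4 booleans, True = that factor favors fair use."""
    favoring = sum(answers)
    verdict = ("leans fair use" if favoring >= 3
               else "leans infringement" if favoring <= 1
               else "a close call")
    return favoring, verdict

# A short snippet with a link, lightly commercial: three factors favor fair use.
favoring, verdict = weigh_fair_use([True, True, False, True])
print(f"{favoring}/4 factors favor fair use: {verdict}")
```

The point of the sketch is the structure, not the score: every aggregation decision should walk through all four factors, and a close call is exactly when to involve counsel.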

Enforcing your copyright: a revenue opportunity?

How can publishers better protect their copyrighted works and maximize their revenue impact? Several services will help you identify copyright infringements, contact the website owner and offer them the opportunity to buy a license to the content. We've previously profiled Attributor, an anti-piracy service and toolset whose services help identify copyright infringements and monetize them. Their model is not about content removal but content monetization, and they claim an 85% response rate on their initial emails to copyright offenders. iCopyright offers publishers the ability to do a one-time search for copyright violations on the Web or to subscribe to the service monthly. They also offer tools to manage your licenses and enable self-service for licensing customers. Be mindful that when you engage one of these companies, they'll find infringements committed by your own advertisers. Have a game plan for how to handle those situations.
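Services like these generally find verbatim copying by comparing overlapping word n-grams ("shingles") between the original article and pages found on the Web. A minimal sketch of that general technique (the sample texts and the shingle length of 5 are assumptions for illustration; I'm not describing any particular vendor's implementation):

```python
# Sketch of shingle-based copy detection: split each text into
# overlapping n-word phrases and measure how many they share.

def shingles(text, n=5):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(original, suspect, n=5):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    a, b = shingles(original, n), shingles(suspect, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

story = "the mayor announced a new budget plan for the city on tuesday"
copy = "the mayor announced a new budget plan for the city on tuesday with cuts"
print(overlap(story, copy))   # high: the suspect page lifted the story verbatim
```

A high overlap score flags a page for human review; it is the reviewer, not the score, who decides whether the reuse is licensed, fair, or infringing.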

Examples of aggregation efforts

Many publishers are launching content aggregation efforts in order to gain traffic.
SmartBrief has built a substantial business creating email newsletters driven 100% by curated aggregation of vertical news. These newsletters contain a small snippet from a news story, a headline, and a link. They serve a commercial purpose, but they use very little of the original work and enhance the market for that work. Fair use.
On the opposite end of the spectrum, Editor & Publisher (E&P) fired their editorial team and resorted to aggregation instead of reporting on news themselves. While this is not necessarily a bad strategy, the execution seems to push the boundaries of what's legal. E&P is taking large chunks of content verbatim and posting them to their own site under the byline "Editor & Publisher Staff." No value is added to the content; it replicates the most pertinent parts of the story and damages the original source by robbing it of page views. In fact, I don't think I'm the only one who feels that way. E&P replicated a large chunk of a New York Times article yesterday morning, and it was removed later in the afternoon. I took a screen shot of the page (click to enlarge), and here's the link. It's unfortunately ironic that a publication that had been a watchdog for journalism is now engaging in practices that undermine it. (They are also posting press releases under a byline. Ugh.) Questionable use; probably too much.
MediaGazer is one of my favorite aggregators in the media space. MediaGazer scans media publications, newspapers and blogs for stories pertinent to the media. It attempts to credit the original source with a large headline and teaser and aggregators with links beneath the teaser. Part automated service and part Megan McCarthy's feel for what will drive usage, this service is a great example of how sites can add value while aggregating. Fair use.
A reader pointed out The Daily Caller, and I was shocked by the flagrant use of copyrighted material. In this particular post, they not only lifted 646 words directly from the New York Times, they also took the photo and attributed it to the NYT instead of Getty Images. While thumbnails are considered fair use, I don't believe a reasonable person would consider this example fair. There is no value added, and this is clearly an attempt to take a piece of an article and have it stand on its own so that users won't click through. Lawsuit waiting to happen... too much.

Final thoughts

If you are planning on launching an aggregation effort, have your editors spend some time with your corporate counsel to train them more deeply in this area. When I was at Penton Media, we spent time each year re-training our editors on issues relating to copyright, libel and slander. If you plan to do the same, aggregation and fair use should clearly be part of that training today. 


source : http://emediavitals.com/content/fair-use-how-much-too-much

Chapter 12 : Knowledge Management

Knowledge Management caters to the critical issues of organizational adaption, survival and competence in face of increasingly discontinuous environmental change.... Essentially, it embodies organizational processes that seek synergistic combination of data and information processing capacity of information technologies, and the creative and innovative capacity of human beings.
 
We are observing increasing hype about the wonders delivered by the newest information technologies in an era characterized by knowledge as the critical resource for business activity. With the advent of new technologies such as data mining, intranets, videoconferencing, and webcasting, several technologists are offering such solutions as a panacea for meeting the business challenges of the knowledge era. Trade press coverage of the 'productivity paradox' has further added to the speed of the information technology (IT) treadmill by suggesting that increasing investments in new information technologies should somehow result in improved business performance. For instance, some recent stories published in the trade press have asserted that certain technologies, such as intranets, have some inherent capability for facilitating organizational transformation initiatives, such as knowledge management.
Interestingly, some technology experts and academic scholars have observed that there is no direct correlation between IT investments and business performance or knowledge management. For instance, Erik Brynjolfsson, a professor at the MIT Sloan School, notes: "The same dollar spent on the same system may give a competitive advantage to one company but only expensive paperweights to another." Hence a key factor in a higher return on the IT dollar is the effective utilization of technology. How industry executives should go about deciphering the mantra of 'effective utilization,' however, remains an elusive issue. This conclusion is supported by the industrywide analyses of IT investments by the technology economist Paul Strassmann, who concluded in his recent book The Squandered Computer that there is no relationship whatsoever between computer expenditures and company performance. On a similar note, John Seely Brown, director of the Xerox PARC research center in Palo Alto, California, underscores that in the last 20 years US industry has invested more than $1 trillion in technology but has realized little improvement in the efficiency or effectiveness of its knowledge workers. He attributes this failure to organizations' ignorance of the ways in which knowledge workers communicate and operate through the social processes of collaborating, sharing knowledge and building on each other's ideas.
Toward A New World Order of Business 
The contrast highlighted above may be attributed to a transition of the economy from an era of competitive advantage based on information to one based on knowledge creation. The earlier era was characterized by relatively slow and predictable change that could be deciphered by most formal information systems. During this period, information systems based on programmable recipes for success were able to deliver on their promises of efficiency based on optimization for given business contexts. Success stories of the IT miracles of this era, such as Mrs. Fields' Cookies, have been chronicled by the Harvard Business School case writers and many others in the academic and trade press. However, as argued by Brian Arthur, Dean of Economics and Population Studies at Stanford and author of Increasing Returns and Path Dependence in the Economy, the new world of knowledge-based industries is distinguished by its emphasis on precognition and adaptation, in contrast to the traditional emphasis on optimization based on prediction. He suggests that the new world of knowledge-based business is characterized by "re-everything," involving continuous redefinition of organizational goals, purposes, and "ways of doing things." This new business environment is characterized by radical and discontinuous change that overwhelms the traditional organizational response of predicting and reacting based on pre-programmed heuristics. Instead, it demands an anticipatory response from organization members, who need to carry out the mandate of a faster cycle of knowledge creation and action based on the new knowledge.
Knowledge Management in the 'Old' Information Era 
In the information era, characterized by relatively predictable change, technology gurus, as well as hardware and software providers, have been offering out-of-the-box solutions that are expected to enable knowledge management. Such off-the-shelf solutions are expected to offer a means of storing best practices devised by human experts in information databases, which may later be used for crunching out pre-determined solutions based on pre-defined parameters. For example, a Software Magazine article defined knowledge management in terms of understanding the relationships of data; identifying and documenting rules for managing data; and assuring that data are accurate and maintain integrity. Similarly, a Computerworld article defined knowledge management in terms of mapping knowledge and information resources both on-line and off-line. The convergent and consensus-building emphasis of such systems is suited to stable and predictable organizational environments. However, such interpretations of knowledge management, based primarily on rules and procedures embedded in technology, seem misaligned with the dynamically changing business environment.
Such programmed solutions may be good enough for devising strategies for a game of business that is based on pre-defined rules, conventions and assumptions. However such mechanistic solutions based on the traditional information-processing emphasis of knowledge management are increasingly inadequate in a business world that demands increasing flexibility and resurfacing of existing assumptions. This is the world which requires not playing by the pre-defined rules, but understanding and adapting as the rules of the game, as well as the game itself, keep changing. Examples of such changing rules, conventions and assumptions of business are suggested by the changing paradigms of organizations with the emergence of virtual corporations and business ecosystems. 
A Definition of Knowledge Management for the New World
We propose a definition of Knowledge Management that attempts to go beyond the quick-fix solutions or unidimensional views offered by many others. This definition is intended to move the thinking of corporate executives towards the strategic, non-linear and systemic view of Knowledge Management reviewed in this article.
"Knowledge Management caters to the critical issues of organizational adaption, survival and competence in face of increasingly discontinuous environmental change. Essentially, it embodies organizational processes that seek synergistic combination of data and information processing capacity of information technologies, and the creative and innovative capacity of human beings." Knowledge Management in the New World of Business 
The traditional paradigm of information systems is based on seeking a consensual interpretation of information according to socially dictated norms or the mandate of the company bosses. This has resulted in confusion between 'knowledge' and 'information'. However, knowledge and information are distinct entities. While information generated by computer systems is not a very rich carrier of human interpretation for potential action, 'knowledge' resides in the user's subjective context of action based on that information. Hence it may not be incorrect to state that knowledge resides in the user and not in the collection of information, a point made two decades ago by West Churchman, the leading thinker on information systems.
Karl Erik Sveiby, the author of The New Organizational Wealth: Managing and Measuring Knowledge-Based Assets, contends that the confusion between 'knowledge' and 'information' has caused managers to sink billions of dollars into technology ventures that have yielded marginal results. He asserts that business managers need to realize that, unlike information, knowledge is embedded in people, and knowledge creation occurs in the process of social interaction. On a similar note, Ikujiro Nonaka, the renowned Professor of Knowledge, has emphasized that only human beings can take the central role in knowledge creation. He argues that computers are merely tools, however great their information-processing capabilities may be. A recent Harvard Business Review special issue on Knowledge Management lends credence to this point of view, highlighting the need for constructive conflict in organizations that aspire to be leaders in innovation and the creation of new knowledge.
The 'wicked environment' of the new world of business imposes the need for variety and complexity in the interpretation of information outputs generated by computer systems. Such variety is necessary for deciphering the multiple world views of an uncertain and unpredictable future. As the strategy guru Gary Hamel underscored in a recent Academy of Management address, non-linear change imposes on organizations the need to devise non-linear strategies. Such strategies cannot be 'predicted' based on a static picture of information residing in the company's databases. Rather, they depend on developing interpretive flexibility by understanding multiple views of the future. In this perspective, the objective of business strategy is not to indulge in long-term planning of the future; rather, the emphasis is on understanding the various world views of the future using techniques such as scenario planning. A similar process of strategic planning was pioneered by Arie de Geus, the strategy chief of the multinational Royal Dutch/Shell, as chronicled in his recent book The Living Company.
Lessons for Business & Technology Executives 
So what can executives do to realign their focus from the old world of 'information management' to the new paradigm of 'knowledge management' discussed here? A condensed checklist of implementation measures for business and technology managers is given in Table 1.
Table 1. Implementation Measures for Facilitating Knowledge Management
  • Instead of the traditional emphasis on controlling the people and their behaviors by setting up pre-defined goals and procedures, they would need to view the organization as a human community capable of providing diverse meanings to information outputs generated by the technological systems.
  • De-emphasize the adherence to the company view of 'how things are done here' and 'best practices' so that such ways and practices are continuously assessed from multiple perspectives for their alignment with the dynamically changing external environment.
  • Invest in multiple and diverse interpretations to enable constructive conflict mode of inquiry and, thus, lessen oversimplification of issues or premature decision closure.
  • Encourage greater proactive involvement of human imagination and creativity to facilitate greater internal diversity to match the variety and complexity of the wicked environment.
  • Give more explicit recognition to tacit knowledge and related human aspects, such as ideals, values, or emotions, for developing a richer conceptualization of knowledge management.
  • Implement new, flexible technologies and systems that support and enable communities of practice, informal and semi-informal networks of internal employees and external individuals based on shared concerns and interests.
  • Make the organizational information base accessible to organization members who are closer to the action while simultaneously ensuring that they have the skills and authority to execute decisive responses to changing conditions.
Brook Manville, Director of Knowledge Management at McKinsey, views the implementation of these issues in terms of the shift from the traditional emphasis on transaction processing, integrated logistics and workflows to systems that support competencies for communications building, people networks, and on-the-job learning. He distinguishes between the three architectures needed for enabling such competencies: 
  • a new information architecture that includes new languages, categories, and metaphors for identifying and accounting for skills and competencies.
  • a new technical architecture that is more social, transparent, open, flexible, and shows respect for the individual users.
  • a new application architecture oriented toward problem solving and representation, rather than output and transactions.
Manville observes that technology will continue to yield disappointing results until IS managers and business executives realize that IT must provide a way to form communities, not simply provide communications. 
In the final analysis, managers need to develop a greater appreciation for their intangible human assets captive in the minds and experiences of their knowledge workers, because without these assets, the companies are simply not equipped with a vision to foresee or to imagine the future while being faced with a fog of unknowingness. As noted by Strassmann, elevating computerization to the level of a magic bullet may lead to the diminishing of what matters the most in any enterprise: educated, committed, and imaginative individuals working for organizations that place greater emphasis on people than on technologies.

A Toolbox for Knowledge Management Initiatives 
Any article on Knowledge Management would not be complete without providing leads for continuous self-learning on the issues discussed in the article. Here is a short checklist of books, magazines and web sites that business and technology managers can use to get up to speed on the 'new' paradigm of Knowledge Management.


source : http://km.brint.com/whatis.htm

Sunday, February 12, 2012

Chapter 11 : Information systems

Global Solutions for the Energy Industry
Global Information Systems is a full service application development and Geographic Information Systems (GIS) products and services company focused on the energy industry. 





Global provides complete service solutions and customer configurable software products for our clients. Global specializes in software development and support, database implementation, GIS As-Builting, CAD-GIS integration, and expert consulting and CAD services.

Our products and services have been instrumental in facilitating change in Engineering, Operations and Integrity Management business processes within the energy transmission industry. Our team of expert, customer service focused staff will be happy to assist you in any way we can. Please contact us for more information on how Global Information Systems can help minimize costs while increasing efficiency, communication, and access to your information today!


source : http://www.globalinformationsystems.com/

Chapter 10 : Information Technology


8) Identify the Pros and Cons of Information Technology.

Information Technology Pros

1. The world gained flexibility
What we think, do or plan must be shared with our co-workers, colleagues and friends, and internet technology has advanced this to a great extent. Alexander Graham Bell's telephone has evolved into the cell phone, giving us even more flexibility in communication: we can talk to the people dear to us whenever we need to!
2. The sense of responsibility has increased
Let us take Barack Obama, the US President, as an example. With the use of social networking sites (Twitter and Facebook), blogs and social bookmarking, a leader can reach the world whenever necessary, and we can receive news and updates on what he has done (or wants us to do) within a very short time.
3. Easier thinking & the evolution of transportation

To think and to research, we need resources to find what people before us thought and what quotes, information and theories they left for us. We can find them with a single click in search engines (especially Google and Yahoo!). With a clear idea in hand, we get the chance to contribute new technological ideas and inventions to the world and to share what we have learned throughout our lives.
Throughout the ages, technology has also helped our transportation strategies evolve, letting us travel from one place to another by road, highway, water and air!
4. Saves thousands of lives daily
As the heading suggests, I am referring to developments in the medical sector. Every day people find relief through the effective use of medicine and hospital technology, including X-rays and laser treatments, with more on the way. With the coordination of the World Health Organization, various fatal diseases can be overcome, and even eliminated from particular countries, through quick plans and ideas.
5. Increases the sense of human rights
Technology can remind us of our human rights and basic needs, and it gives updates wherever relief or worldwide help is necessary. During earthquakes and terrible floods, when cooperation is needed, the World Wide Web can help us collect donations of the desired amount.
It is not possible to cover everything about the good and bad sides of information technology in a single article, because it has mixed into every aspect and corner of our lives. Instead, let us look at the side effects and drawbacks (the cons) that the IT sector has brought to human society.

source : http://113tidbits.com/the-pros-and-cons-of-information-technology/3696/

Information Technology Cons

1. It has taken away people's privacy
IT has won people's hearts worldwide, and people now share and store all kinds of information and private data on their hard drives and in private online databases. But because of cyber-criminals, nothing is truly safe, either online or offline. If someone becomes a bit careless, he or she may pay a high price for it. (It's serious.)
2. The online community is not safe for the family anymore
Underage children may share cell numbers and private email addresses, which can be hacked and passed to criminals with a blueprint to harm society. People are also losing the privacy of their credit cards and other payment options. Again, there are sites, created by nasty people, that can lead teenagers under eighteen down a different path, one that is going to bring harm to the nation.
3. It is going to damage humans' natural powers
We can think, absorb human principles (ethical knowledge) and build cooperative relationships between friends and families. But due to the harmful aspects of IT, people are becoming fully technology-dependent, and that can do huge damage to society by taking away natural thought and original ideas.
4. It can bring world destruction without efficient administration
This is an extra point, which I am writing with various works of science fiction in mind. Great scholars have thought about the matter wisely. Until now we humans have held the leading place in the world, administering computer technology. But a day MAY come when technology administers us in all respects. It may well happen that we are converted into the slaves of technology.
By listing these cons, I am not trying to say that technology is here only to bring harm; I am myself a technology person who spends hours a day browsing the computer and talking on the phone. But as part of human society, we need to take a look at both sides of IT.

source : http://113tidbits.com/the-pros-and-cons-of-information-technology/3696/

Chapter 9 : Characteristics of good websites

I chose this website because it is helpful: it helps me find anything by typing a few words, and I can see the results immediately.

KEY ELEMENTS OF AN EFFECTIVE WEBSITE

1. Appearance
2. Content
3. Functionality
4. Website Usability
5. Search Engine Optimization

Biased information


Information bias is a type of cognitive bias that involves, for example, a distorted evaluation of information. Information bias occurs because of people's curiosity and confusion of goals when trying to choose a course of action.
Biased information (examples)

Global warming
See Examples of Bias in Wikipedia: Global warming
Homosexuality 
See Examples of Bias in Wikipedia: Homosexuality

Liberal Politicians

See Examples of Bias in Wikipedia: Liberal Politicians

Obama

See Examples of Bias in Wikipedia: Obama

Science and Evolution

See Examples of Bias in Wikipedia: Science and Evolution

Conspiracy Theories

See Examples of Bias in Wikipedia: Conspiracy theories





Monday, October 17, 2011

Chapter 8 : How do search engines work

The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.


      

Web search

A web search engine is designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list of results, often referred to as SERPs, or "search engine results pages". The information may consist of web pages, images, information and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines operate algorithmically or use a mixture of algorithmic and human input.
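The core data structure behind a crawler-based engine is an inverted index: a map from each word to the pages that contain it, on top of which ranked retrieval is built. A toy sketch of the idea (the three sample pages and the plain term-count ranking are assumptions for illustration; real engines use far more elaborate crawling and scoring):

```python
# Toy inverted index: map each word to the pages containing it,
# then rank query results by how many query terms each page matches.
from collections import defaultdict

# Stand-in for crawled pages (made up for this sketch).
pages = {
    "page1.html": "search engines crawl the web and index pages",
    "page2.html": "human editors maintain web directories by hand",
    "page3.html": "crawlers index web pages for search engines",
}

# Build the inverted index: word -> set of page URLs.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    """Return page URLs ranked by number of query terms matched."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for url in index.get(term, set()):
            scores[url] += 1
    return sorted(scores, key=lambda u: (-scores[u], u))

print(search("search index"))   # both matching pages, highest score first
```

Directory-style engines skip the crawling and indexing steps entirely: a human editor places each page into a category by hand, which is exactly the difference the paragraph above describes.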

                                                  
source : http://www.seobook.com/relevancy/
How Search Engines Work: Search Engine Relevancy Reviewed
This article is a fairly comprehensive review of search engine relevancy algorithms, published by Aaron Wall on June 13, 2006. While some of the general details have changed, the major themes referenced in this article were still relevant when I reviewed it a year after publishing it.
However, when I reviewed it again on January 12, 2011, there had been significant changes:


  • Yahoo! Search is now powered by Bing in the United States and Google in Japan.
  • Ask announced they were leaving the search space to focus on QnA, and their core search will be powered by another search engine.
  • A couple newer smaller search engines (like Blekko and DuckDuckGo) have launched.
  • Some foreign search engines that dominate their home markets (like Yandex and Baidu) are looking to become global players.
                                                                   
source : http://money.howstuffworks.com/youtube.htm
YouTube was designed to let people share videos with the rest of the world. In November 2005, Sequoia Capital invested more than $3 million in the site, and a month later YouTube emerged as a full-fledged Web destination. It didn't take long for the site to become popular, and in November 2006 the Internet search goliath Google purchased YouTube for $1.65 billion.
As the company has grown, so has the scope of the videos on the site. In the early days of YouTube, you could find videos showing interesting locations, crazy stunts and hilarious pranks. You can still find that sort of content today, but you'll also see political debates, musical performances, instructional videos and unfiltered war footage. In 2007, YouTube even provided members with a way to interact with potential United States presidential candidates. YouTube members submitted video questions, and CNN featured some of them in Democratic and Republican candidate debates.

                                           
                               
source : http://computer.howstuffworks.com/internet/basics/search-engine1.htm
                                  
When most people talk about Internet search engines, they really mean World Wide Web search engines. Before the Web became the most visible part of the Internet, there were already search engines in place to help people find information on the Net. Programs with names like "gopher" and "Archie" kept indexes of files stored on servers connected to the Internet and dramatically reduced the amount of time required to find programs and documents. In the late 1980s, getting serious value from the Internet meant knowing how to use gopher, Archie, Veronica and the rest.
Today, most Internet users limit their searches to the Web, so we'll limit this article to search engines that focus on the contents of Web pages.


                                      


source : http://en.wikipedia.org/wiki/Web_search_engine
                                      
The popularity of the personal computer as a business tool has a lot to do with a company founded by two men, Paul Allen and Bill Gates. In 1975 the duo wrote a version of BASIC for one of the very first personal computers, the Micro Instrumentation and Telemetry Systems (MITS) Altair [source: Microsoft]. It wouldn't be long before their success led them to found their own software company, called Micro-Soft.
Now, after more than 30 years, one corporate name change and several operating systems later, Microsoft is on top of the computer world. In the meantime, Gates and Allen have become billionaires, with Gates reigning as the richest man in the world. Even an $18 billion loss in 2008 didn't knock him off the top.


                                                                                               

Shapter 8 how do search engines work

              The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.

Web search

From Wikipedia, the free encyclopedia

A web search engine is designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list, often referred to as search engine results pages (SERPs). The results may consist of web pages, images, and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines operate algorithmically or use a mixture of algorithmic and human input.
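The "algorithmic" side of this definition rests on an index built ahead of time: rather than scanning every page at query time, the engine maps each word to the pages that contain it and intersects those lists. A toy inverted index in Python (the page contents are invented for illustration):

```python
def build_index(pages):
    """Map each lowercased word to the set of page ids containing it."""
    index = {}
    for page_id, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(page_id)
    return index

def search(index, query):
    """Return pages containing every word of the query (AND semantics)."""
    results = None
    for word in query.lower().split():
        hits = index.get(word, set())
        results = hits if results is None else results & hits
    return results or set()

pages = {
    "p1": "web search engines crawl the web",
    "p2": "directories are maintained by human editors",
    "p3": "search engines operate algorithmically",
}
index = build_index(pages)
print(search(index, "search engines"))  # pages matching both words
```

Real engines store far more per entry (positions, link data, formatting), but the lookup-then-intersect shape is the same.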

How Search Engines Work: Search Engine Relevancy Reviewed
This article is a fairly comprehensive review of search engine relevancy algorithms, published by Aaron Wall on June 13, 2006. While some of the details have changed, the major themes were still relevant when I reviewed the article a year after publication.
When I reviewed it again on January 12, 2011, however, there had been significant changes:
  • Yahoo! Search is now powered by Bing in the United States and by Google in Japan.
  • Ask announced it was leaving the search space to focus on Q&A, with its core search powered by another search engine.
  • A couple of newer, smaller search engines (such as Blekko and DuckDuckGo) have launched.
  • Some foreign search engines that dominate their home markets (such as Yandex and Baidu) are looking to become global players.
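The relevancy algorithms such reviews survey differ enormously in detail, but many start from the same classic idea: weight a query term highly when it is frequent in a document yet rare across the collection (TF-IDF). A bare-bones sketch, with invented documents, just to make the idea concrete:

```python
import math

def tf_idf_score(query, doc, docs):
    """Score doc for query: term frequency times inverse document frequency."""
    words = doc.lower().split()
    n_docs = len(docs)
    score = 0.0
    for term in query.lower().split():
        tf = words.count(term) / len(words)          # how often the term appears here
        df = sum(1 for d in docs if term in d.lower().split())
        if df:
            score += tf * math.log(n_docs / df)      # rarer terms weigh more
    return score

docs = [
    "search engines rank pages by relevancy",
    "relevancy algorithms weight rare terms",
    "cooking pasta at home",
]
scores = [tf_idf_score("relevancy algorithms", d, docs) for d in docs]
print(scores)  # the second document should score highest
```

Modern engines layer link analysis, freshness, and hundreds of other signals on top, which is exactly why the rankings shift between reviews like the two dated above.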

YouTube's founders designed the site to let people share videos with the rest of the world. In November 2005, Sequoia Capital invested more than $3 million in the site, and a month later YouTube emerged as a full-fledged Web destination. It didn't take long for the site to become popular, and in November 2006, Internet search engine goliath Google purchased YouTube for $1.65 billion.
As the company has grown, so has the scope of the videos on the site. In the early days of YouTube, you could find videos showing interesting locations, crazy stunts and hilarious pranks. You can still find that sort of content today, but you'll also see political debates, musical performances, instructional videos and unfiltered war footage. In 2007, YouTube even gave members a way to interact with potential United States presidential candidates: members submitted video questions, and CNN featured some of them in the Democratic and Republican candidate debates.