Is self-linking a tear in the fabric of the web?

More often than not, web and blog publishers link to their own related stories on the same or affiliated web properties instead of providing outbound links. Topic pages like those found on CNN accomplish this in search engine-like fashion. The objective, of course, is to keep readers on their sites instead of sending them to search engines for related information. But is this a tear in the fabric of the web? Tim O'Reilly thinks it is, and in Is Linking to Yourself the Future of the Web? he offers two guidelines for anyone adopting this "link to myself" strategy:

  1. Ensure that no more than 50% of the links on any page are to yourself. (Even this number may be too high.)
  2. Ensure that the pages you create at those destinations are truly more valuable to your readers than any other external link you might provide.
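
O'Reilly's first guideline is a measurable threshold, so it can be checked mechanically. A minimal sketch (the page URLs and domain names here are hypothetical, purely for illustration):

```python
from urllib.parse import urlparse

def self_link_ratio(link_urls, own_domain):
    """Fraction of a page's links that point back to the publisher's own domain."""
    if not link_urls:
        return 0.0
    internal = sum(
        1 for url in link_urls
        if urlparse(url).netloc.endswith(own_domain)
    )
    return internal / len(link_urls)

# Hypothetical link list scraped from one article page.
links = [
    "https://example.com/related-story",
    "https://example.com/topic/internet",
    "https://other-site.org/analysis",
    "https://example.com/archive",
]
ratio = self_link_ratio(links, "example.com")
print(f"{ratio:.0%} self-links")  # 75% -- over O'Reilly's 50% ceiling
```

A publisher auditing topic pages against the guideline would run this over each page's extracted anchor list and flag anything above 0.5.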


September 4, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Theo Schlossnagle on Internet Traffic Spikes

Theo Schlossnagle analyzes how Internet traffic spikes are shifting in Dissecting Today's Internet Traffic Spikes. Hat tip to Jesse Robbins, O'Reilly Radar. [JH]

July 15, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Investigation of P2P Copyright Enforcement

Michael Piatek, Tadayoshi Kohno and Arvind Krishnamurthy (University of Washington, Department of Computer Science & Engineering) have published the results of their study of P2P copyright enforcement actions.

From the overview:

As people increasingly rely on the Internet to deliver downloadable music, movies, and television, content producers are faced with the problem of increasing Internet piracy. To protect their content, copyright holders police the Internet, searching for unauthorized distribution of their work on websites like YouTube or peer-to-peer networks such as BitTorrent. When infringement is (allegedly) discovered, formal complaints are issued to network operators that may result in websites being taken down or home Internet connections being disabled.

Although the implications of being accused of copyright infringement are significant, very little is known about the methods used by enforcement agencies to detect it, particularly in P2P networks. We have conducted the first scientific, experimental study of monitoring and copyright enforcement on P2P networks and have made several discoveries which we find surprising.

Practically any Internet user can be framed for copyright infringement today. By profiling copyright enforcement in the popular BitTorrent file sharing system, we were able to generate hundreds of real DMCA takedown notices for computers at the University of Washington that never downloaded nor shared any content whatsoever. Further, we were able to remotely generate complaints for nonsense devices including several printers and a (non-NAT) wireless access point. Our results demonstrate several simple techniques that a malicious user could use to frame arbitrary network endpoints.

Even without being explicitly framed, innocent users may still receive complaints. Because of the inconclusive techniques used to identify infringing BitTorrent users, users may receive DMCA complaints even if they have not been explicitly framed by a malicious user and even if they have never used P2P software! Software packages designed to preserve the privacy of P2P users are not completely effective. To avoid DMCA complaints today, many privacy conscious users employ IP blacklisting software designed to avoid communication with monitoring and enforcement agencies. We find that this software often fails to identify many likely monitoring agents, but we also discover that these agents exhibit characteristics that make distinguishing them straightforward.
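
The framing vector the authors describe works because monitors trust tracker peer lists without ever exchanging data with the listed peers, and the original BitTorrent tracker protocol lets a client claim any address via an optional `ip` parameter in its announce request. A rough sketch of that request (the tracker URL and identifiers are hypothetical, and this simply builds the URL rather than sending it):

```python
from urllib.parse import urlencode

def build_announce_url(tracker, info_hash, peer_id, claimed_ip, port):
    """Build a tracker announce URL that reports an arbitrary, unverified IP."""
    params = {
        "info_hash": info_hash,   # 20-byte torrent identifier
        "peer_id": peer_id,       # 20-byte client identifier
        "ip": claimed_ip,         # claimed address -- many trackers never verify it
        "port": port,
        "uploaded": 0,
        "downloaded": 0,
        "left": 0,
    }
    return f"{tracker}?{urlencode(params)}"

url = build_announce_url(
    "http://tracker.example.net/announce",   # hypothetical tracker
    b"\x12" * 20,                            # placeholder info_hash
    b"-XX0001-" + b"0" * 12,                 # placeholder peer_id
    "198.51.100.7",                          # the "framed" address
    6881,
)
print(url)
```

A monitor that records every address the tracker returns, and files complaints on that basis alone, would attribute activity to 198.51.100.7 even though that host never joined the swarm.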

Hat tip to Christine Corcos (LSU), Media Law Prof Blog. [JH]

June 23, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

McAfee Identifies Most Dangerous Web Domains to Surf and Search

The second annual McAfee Mapping the Mal Web report (pdf) on the riskiest and safest places on the Web reveals that 19.2% of all Web sites ending in the ".hk" domain pose a security threat to Web users. China (.cn) is second this year at over 11%. By contrast, Finland (.fi) remains the safest online destination for the second year at 0.05%, followed by Japan (.jp). Among generic domains, McAfee found .info to be the most dangerous: according to the report, as much as 11.8 percent of all .info sites pose genuine security threats. Government (.gov) sites are the least likely to pose a security threat, while the most popular domain on the Internet, .com, is the ninth riskiest overall.

Other key findings from McAfee's 2008 Mapping the Mal Web report include:

The McAfee report is based on results from 9.9 million Web sites tested across 265 domains for serving malicious code, displaying excessive pop-up ads, or presenting forms that are actually tools for harvesting e-mail addresses for spam. [JH]

June 11, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

New Book on the Practice and Policy of Global Internet Filtering

Access Denied provides the definitive analysis of government justifications for denying their own people access to some information and also documents global Internet filtering practices on a country-by-country basis. This is timely and important. --Jonathan Aronson, Annenberg School for Communication, University of Southern California

Access Denied: The Practice and Policy of Global Internet Filtering
Edited by Ronald J. Deibert, John G. Palfrey, Rafal Rohozinski & Jonathan Zittrain.

List Price: $20.00
Paperback: 320 pages
Publisher: The MIT Press (February 29, 2008)
ISBN-10: 0262541963
ISBN-13: 978-0262541961

Product Description: Many countries around the world block or filter Internet content, denying access to information--often about politics, but also relating to sexuality, culture, or religion--that they deem too sensitive for ordinary citizens. Access Denied documents and analyzes Internet filtering practices in over three dozen countries, offering the first rigorously conducted study of this accelerating trend.

Internet filtering takes place in at least forty states worldwide including many countries in Asia and the Middle East and North Africa. Related Internet content control mechanisms are also in place in Canada, the United States and a cluster of countries in Europe. Drawing on a just-completed survey of global Internet filtering undertaken by the OpenNet Initiative (a collaboration of the Berkman Center for Internet and Society at Harvard Law School, the Citizen Lab at the University of Toronto, the Oxford Internet Institute at Oxford University, and the University of Cambridge) and relying on work by regional experts and an extensive network of researchers, Access Denied examines the political, legal, social, and cultural contexts of Internet filtering in these states from a variety of perspectives. Chapters discuss the mechanisms and politics of Internet filtering, the strengths and limitations of the technology that powers it, the relevance of international law, ethical considerations for corporations that supply states with the tools for blocking and filtering, and the implications of Internet filtering for activist communities that increasingly rely on Internet technologies for communicating their missions.

Reports on Internet content regulation in forty different countries follow, with each country profile outlining the types of content blocked by category and documenting key findings.

Contributors: Ross Anderson, Malcolm Birdling, Ronald Deibert, Robert Faris, Vesselina Haralampieva, Steven Murdoch, Helmi Noman, John Palfrey, Rafal Rohozinski, Mary Rundle, Nart Villeneuve, Stephanie Wang, and Jonathan Zittrain

April 11, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Cybercrime Legislation: 27 Country Profiles

The cybercrime national legislation profiles have been prepared within the framework of the Council of Europe’s Project on Cybercrime for sharing information on cybercrime legislation and assessing the current state of implementation of the Convention on Cybercrime under national legislation. Hat tip to beSpacific. [JH]

April 4, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

A Portrait of Early Internet Adopters

A recent Pew/Internet survey asked early Internet adopters why they first went online. The majority of respondents answered "to communicate with colleagues." In other words, they went online to network via a new communications medium. "Social networking is nothing new. Remember BBSs and Usenet, chat rooms and threaded discussions," writes Amy Tracy Wells in A Portrait of Early Internet Adopters: Why People First Went Online --and Why They Stayed. [JH]

March 5, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Pew/Internet's Survey on Problem-Solving Information Search Patterns

In a national phone survey, Pew/Internet asked respondents whether they had encountered any of 10 possible problems in the previous two years, all of which had a potential connection to the government or government-provided information. Those who had dealt with one of the problems were asked where they went for help, and the internet topped the list:

58% of those who had recently experienced one of those problems said they used the internet (at home, work, a public library or some other place) to get help.

53% said they turned to professionals such as doctors, lawyers or financial experts.

45% said they sought out friends and family members for advice and help.

36% said they consulted newspapers and magazines.

34% said they directly contacted a government office or agency.

16% said they consulted television and radio.

13% said they went to the public library.

For details, check out Pew/Internet's Information Searches That Solve Problems [Report (pdf)]. [JH]

February 22, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Resource Documents the History of Silicon Valley

The History of Silicon Valley, the Internet & the PC collects archived interviews with early innovators in all three fields, early photographs, and links to detailed research papers.  Very interesting. Hat tip to LISNews. [JH]

February 11, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Is the Google Generation a Myth?

Yes, according to Information Behaviour of the Researcher of the Future (pdf), a new report commissioned by JISC and the British Library.

From the press release:

[The report] counters the common assumption that the ‘Google Generation’ – young people born or brought up in the Internet age – is the most adept at using the web. The report by the CIBER research team at University College London claims that, although young people demonstrate an ease and familiarity with computers, they rely on the most basic search tools and do not possess the critical and analytical skills to assess the information that they find on the web. The report Information Behaviour of the Researcher of the Future also shows that research-behaviour traits that are commonly associated with younger users – impatience in search and navigation, and zero tolerance for any delay in satisfying their information needs – are now the norm for all age-groups, from younger pupils and undergraduates through to professors.

[Emphasis added]


January 29, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

TechPresident's 2007 Campaign Web Index

TechPresident's 2007 Campaign Web Index is an opinion survey that asked 13 questions to determine which presidential campaigns were best at using the various elements of the web. The panel judged Ron Paul and Barack Obama to have the best overall web presences, and the two also led their respective fields in the most individual categories. Mike Huckabee and John Edwards followed, each earning strong support from the panel. But while these four campaigns were the leaders, there were surprises in specific categories. For example, Hillary Clinton and Mitt Romney scored the most points for their online rapid response work.

The survey asked the following questions:

  1. Which campaign has made the best use of online video?
  2. Which campaign has made the best use of email?
  3. Which campaign has made the best use of online social networking?
  4. Which campaign has made the best use of its blog?
  5. Which campaign has done the best work engaging online political activists?
  6. Which campaign has made the best use of "Web 2.0" techniques like RSS, widgets and tagging?
  7. Which campaign is doing the best work spending money on online advertising?
  8. Which campaign is making the best use of mobile technology?
  9. Which campaign is doing the best work raising money using online tools?
  10. Which campaign has done the best job of informing voters about their candidate's position on issues?
  11. Which campaign has done the best "rapid response" work online?
  12. Which campaign has made the best use of the web to decentralize power?
  13. Which campaign has the best overall web presence?


January 11, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

The First 100 Dot Com Domains

The list below has been making the web rounds lately. No Microsoft, but Xerox, IBM, Sun and Intel jumped on the Internet bandwagon pretty early in the game.

Where are they today? Here's the web page placeholder text from the first listed site:

"Symbolics is currently a privately held company which acquired the assets and intellectual property of the old public company called Symbolics, Inc. The old Symbolics was the premier producer of special-purpose computer systems for running and developing state-of-the-art object-oriented programs in Lisp. It designed and built workstations as well as writing a fully object-oriented operating system and development environment called "Genera" to run on those workstations. Symbolics also created a number of software tools to work with Genera.  The new Symbolics continues to sell and maintain these products, along with Open Genera which runs on Alpha processor based workstations running Tru64 Unix.  If you would like to know why you should be interested in developing your application in Genera, click here to see 25 reasons. Symbolics also distributes the Macsyma and PDEase software products for Windows PCs."

March 15 1985
April 24 1985
May 24 1985
July 11 1985
September 30 1985
November 7 1985
January 9 1986
January 17 1986
March 3 1986
March 5 1986
March 19 1986
March 19 1986
March 25 1986
March 25 1986
April 25 1986
May 8 1986
May 8 1986
July 10 1986
July 10 1986
August 5 1986
August 5 1986
August 5 1986
August 5 1986
August 5 1986
August 5 1986
September 2 1986
September 18 1986
September 29 1986
October 18 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
October 27 1986
November 5 1986
November 5 1986
November 17 1986
November 17 1986
November 17 1986
November 17 1986
November 17 1986
November 17 1986
November 17 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
December 11 1986
January 19 1987
January 19 1987
January 19 1987
February 19 1987
March 4 1987
March 4 1987
April 4 1987
April 23 1987
April 23 1987
April 23 1987
April 23 1987
April 30 1987
May 14 1987
May 14 1987
May 20 1987
May 27 1987
May 27 1987
June 26 1987
July 9 1987
July 13 1987
July 27 1987
July 27 1987
July 28 1987
August 18 1987
August 31 1987
September 3 1987
September 3 1987
September 3 1987
September 22 1987
September 22 1987
September 22 1987
September 22 1987
September 30 1987
October 14 1987
November 2 1987
November 9 1987
November 16 1987
November 16 1987
November 24 1987
November 30 1987


January 4, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

Beware of Potentially Deceptive Fee-Based Music Download Services

Check out The Center for Democracy & Technology's Music Download Warning List. The list identifies download services that advertise large inventories of music files they may not actually have, and that may not have obtained permission to sell the files they list. [JH]

January 3, 2008 in Internet, General | Permalink | Comments (0) | TrackBack

RIP Netscape Navigator

AOL has announced that Netscape Navigator, the browser that launched the commercial Internet in October 1994, will die on February 1, 2008. AOL acquired Netscape in November 1998 for $4.2 billion (ouch). [JH]

December 31, 2007 in Internet, General | Permalink | Comments (0) | TrackBack

New Protocol To Limit News Aggregators

Newspaper publishers want to implement a new protocol that would define permissions and restrictions for content on news websites.  The Automated Content Access Protocol would tell search engines what they can and can't use, and for how long.  Hat tip to Tech Law Prof Blog. [JH]
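
ACAP was designed as an extension of the familiar Robots Exclusion Protocol, which today can express little more than crawl or don't-crawl. For contrast, here is a sketch of the coarse-grained control publishers currently have, using Python's standard-library robots.txt parser (the user agent and paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A plain robots.txt: all crawlers may index the site except the archive.
rules = """\
User-agent: *
Disallow: /archive/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("NewsBot", "https://paper.example.com/today.html"))
print(parser.can_fetch("NewsBot", "https://paper.example.com/archive/a1"))
```

ACAP's stated goal was to layer richer permissions on top of this model, such as how long aggregators may retain or display content, rather than the binary allow/disallow shown here.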

December 27, 2007 in Internet, General | Permalink | Comments (0) | TrackBack

Everyone’s Guide to By-Passing Internet Censorship

Interesting guide from Citizen Lab: "This guide is meant to introduce non-technical users to Internet censorship circumvention technologies, and help them choose which of them best suits their circumstances and needs."   [RJ]

December 13, 2007 in Internet, General | Permalink | Comments (0) | TrackBack

The Internet Singularity, Delayed

In The Internet Singularity, Delayed: Why Limits in Internet Capacity Will Stifle Innovation on the Web, Nemertes performed an in-depth analysis of Internet and IP infrastructure (capacity) and current and projected traffic (demand) with the goal of understanding how each has changed over time, and determining if there will ever be a point at which demand exceeds capacity. 

From the executive summary:

To assess infrastructure capacity, we reviewed details of carrier expenditures and vendor revenues, and compared these against market research studies. To compute demand, we took a unique approach: Instead of modeling user behavior based on measuring the application portfolios that users had currently deployed, and projecting deployment of those applications in future, we looked directly at how user consumption of available bandwidth has changed over time.


Our findings indicate that although core fiber and switching/routing resources will scale nicely to support virtually any conceivable user demand, Internet access infrastructure, specifically in North America, will likely cease to be adequate for supporting demand within the next three to five years. We estimate the financial investment required by access providers to bridge the gap between demand and capacity ranges from $42 billion to $55 billion, or roughly 60%-70% more than service providers currently plan to invest.

It’s important to stress that failing to make that investment will not cause the Internet to collapse. Instead, the primary impact of the lack of investment will be to throttle innovation: both the technical innovation that leads to increasingly newer and better applications, and the business innovation that relies on those technical innovations and applications to generate value. The next Google, YouTube, or Amazon might not arise, not because of a lack of demand, but due to an inability to fulfill that demand. Rather like osteoporosis, the underinvestment in infrastructure will painlessly and invisibly leach competitiveness out of the economy.
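
The report's core comparison can be reduced to a toy model: project capacity and demand as compound growth and find the year demand overtakes capacity. The starting values and growth rates below are illustrative placeholders, not Nemertes figures:

```python
def crossover_year(capacity, demand, capacity_growth, demand_growth, max_years=20):
    """Return the first year in which projected demand exceeds capacity, else None."""
    for year in range(1, max_years + 1):
        capacity *= 1 + capacity_growth   # compound capacity build-out
        demand *= 1 + demand_growth       # compound bandwidth consumption
        if demand > capacity:
            return year
    return None

# Demand starts at half of capacity but grows much faster (placeholder rates).
print(crossover_year(capacity=100.0, demand=50.0,
                     capacity_growth=0.40, demand_growth=0.90))  # -> 3
```

Even with demand starting at half of capacity, the faster growth rate produces a crossover within a few years, which is the shape of the report's three-to-five-year warning.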

Hat tip to beSpacific. [JH]

December 4, 2007 in Internet, General | Permalink | Comments (0) | TrackBack

New Book Offers Analysis of the Culture and History of the Computer Virus Phenomenon

Digital Contagions: A Media Archaeology of Computer Viruses
by Jussi Parikka

List Price: $35.95
Paperback: 327 pages
Publisher: Peter Lang Publishing (June 2007)
ISBN-10: 0820488372
ISBN-13: 978-0820488370

Book Description: Digital Contagions is the first book to offer a comprehensive and critical analysis of the culture and history of the computer virus phenomenon. The book maps the anomalies of network culture from the angles of security concerns, the biopolitics of digital systems, and the aspirations for artificial life in software. The genealogy of network culture is approached from the standpoint of accidents that are endemic to the digital media ecology. Viruses, worms, and other software objects are not, then, seen merely from the perspective of anti-virus research or practical security concerns, but as cultural and historical expressions that traverse a non-linear field from fiction to technical media, from net art to politics of software. Jussi Parikka mobilizes an extensive array of source materials and intertwines them with an inventive new materialist cultural analysis. Digital Contagions draws from the cultural theories of Gilles Deleuze and Félix Guattari, Friedrich Kittler, and Paul Virilio, among others, and offers novel insights into historical media analysis.


October 29, 2007 in Internet, General | Permalink | Comments (0) | TrackBack

Twenty-five Developments That Changed the Internet

Take a walk down memory lane. See USA Today's Things That Changed the Internet. [JH]

October 19, 2007 in Internet, General | Permalink | Comments (0) | TrackBack