tech support 8


Friday, 30 October 2009

Internet Turns 40, Just Might Catch On

Posted on 10:38 by Unknown

Media outlets seem to have settled on October 29 as the official birthday of the Internet. This date has been chosen because it's the day that Leonard Kleinrock's team at the University of California, Los Angeles sent a message over a two-computer network (the other end being a computer at Stanford Research Institute), with Charley Kline manning the UCLA keyboard and Bill Duvall at the SRI end. It's worth noting that the computer carrying the first-ever transmission on the Internet ("LOGIN") crashed after only two letters ("LO"). I believe that Kline actually typed an "L" for the third letter (instead of "G") and, in a fit of future-sensing self-sacrifice, executed a core dump all over the floor.

Some may point out that on September 2, 1969, two computers were connected with a 15-foot cable and passed data back and forth. That was a precursor to the networking that happened a month later, but it is not generally regarded as the birth of the Internet, just as neither the first email message (1971) nor the first web browser (1993) is considered the birth of the Internet.

Given this historic day, there has been a lot of media coverage (some of it pretty bad, just like the average YouTube video) detailing some of the steps or milestones of the last 40 years. Some of the crunchy bits:

  • Web pioneer recalls 'birth of the Internet' - CNN interviews Leonard Kleinrock, responsible for hooking up those two computers
  • From the first email to the first YouTube video: a definitive internet history - Guardian article that covers the first email, the first virus, and the first smiley (ugh), but fails to mention the first spam.
  • Forty years of the internet: how the world changed for ever - also from the Guardian site
  • A people's history of the internet: from Arpanet in 1969 to today - Guardian timeline integrating reader stories and curious data bits.
  • 'Lo' And Behold: A Communication Revolution - NPR interviews Charley Kline for All Things Considered.

The opening image of this post is an Internet timeline (in extra large format so you can read it from the other room, or across the street) from Daily News LA article "How the Internet was born at UCLA."

Videos and Audio Bits

The All Things Considered broadcast:

A somewhat technical perspective of the time leading up to and after the birth of the Internet:

A video from 1993 by the CBC covering the "growing phenomenon of Internet" (covering mostly just Usenet):

The Web

For those who don't quite understand the relationship between the web and the Internet as a whole, the World Wide Web came much later: first as a proposal to CERN by Tim Berners-Lee in March of 1989, and then in the form of NCSA Mosaic in April of 1993 (no, Mosaic was not the first web browser, but it was the first to get traction).

To qualify that a bit more, if anyone comes to you claiming 25 years of web experience (as one follower on Twitter recently did), you can send them away. The web is barely old enough to drive.

Update: There are no Al Gore jokes in here. This was intentional. Srsly. Then I'd have to link to a photo of Al Gore and nobody wants that.

Posted in internet | No comments

Thursday, 29 October 2009

Reminder: See Me Speak, Tues. Nov. 3

Posted on 07:44 by Unknown

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203 (map below). BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.


Posted in SEM, SEO, social media, speaking, standards | No comments

Wednesday, 28 October 2009

Google CEO Describes Web in 5 Years

Posted on 08:09 by Unknown

ReadWriteWeb posted an article (Google's Eric Schmidt on What the Web Will Look Like in 5 Years) highlighting some bits from Eric Schmidt's (Google CEO) talk at the Gartner Symposium/ITXpo Orlando 2009. ReadWriteWeb was even kind enough to post a 6 minute excerpt that they believe would be of interest to those of us on the web:

He has some specific ideas that stand out:

  • Citing Moore's Law and claiming that five years equates to roughly a factor-of-ten increase in computing power, he sees far more powerful computers and broadband in excess of 100 Mbps. This feeds his idea that TV, radio, and the web (YouTube, etc.) will all become merged in some way.
  • The internet will be dominated by content in Chinese (although he's silent on the future of the Great Firewall of China).
  • Real-time content and real-time search may further push traditional news sources to a tier below user-generated content.
  • Watch how today's teenagers use the web, as they will drive how employees consume content in 5 years.
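
Schmidt's "factor of ten" figure is just Moore's Law compounded over five years. A quick sanity check, assuming the commonly quoted 18-month doubling period (my assumption, not a figure from the talk):

```javascript
// Moore's law back-of-the-envelope: doublings over five years at one
// doubling every 18 months, compounded.
var months = 5 * 12;             // five years, in months
var doublingPeriod = 18;         // assumed months per doubling
var factor = Math.pow(2, months / doublingPeriod);
console.log(factor.toFixed(1));  // ≈ 10.1
```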

Catch the full 45 minute interview on YouTube: Eric Schmidt, CEO of Google, interviewed by Gartner analysts Whit Andrews and Hung LeHong in front of 5,000 CIOs and IT Directors at Gartner Symposium/ITxpo Orlando 2009.

Posted in Google | No comments

Tuesday, 27 October 2009

New Google Analytics Features

Posted on 11:45 by Unknown

In the article "Google Analytics Now More Powerful, Flexible and Intelligent" from last Tuesday (yes, I know I'm behind on this) on the Google Analytics Blog, the Analytics team has introduced some interesting new features. Some of the updates:

  • Two new goal types allow you to set thresholds for Time on Site and Pages per Visit. You can also define up to 20 goals per profile.
  • A server-side chunk of code is coming to allow tracking on mobile sites (since mobile browsers may not be able to run the JavaScript Analytics code on a page).
  • Advanced table filtering allows you to filter rows based on assorted conditions. In their sample video they show how to filter keywords to identify just the keywords with a bounce rate less than 30% that referred at least 25 visits (it's much easier to understand with this video).
  • You can now select unique visitors as a metric for custom reports.
  • You can now have multiple custom variables so you can track visitors according to visitor attributes, session attributes and page attributes.
  • You can now share custom reports and segments with other Analytics users.

The blog post is most excited about two items:

Analytics Intelligence: We're launching the initial phase of an algorithmic driven Intelligence engine to Google Analytics. Analytics Intelligence will provide automatic alerts of significant changes in the data patterns of your site metrics and dimensions over daily, weekly and monthly periods. For instance, Intelligence could call out a 300% surge in visits from YouTube referrals last Tuesday or let you know bounce rates of visitors from Virginia dropped by 70% two weeks ago. Instead of you having to monitor reports and comb through data, Analytics Intelligence alerts you to the most significant information to pay attention to, saving you time and surfacing traffic insights that could affect your business. Now, you can spend your time actually taking action, instead of trying to figure out what needs to be done.
Custom Alerts make it possible for you to tell Google Analytics what to watch for. You can set daily, weekly, and monthly triggers on different dimensions & metrics, and be notified by email or right in the user interface when the changes actually occur. [....]
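
Stripped of the product framing, the alert logic Google describes boils down to a percentage-change threshold check against the prior period. A hypothetical sketch (this is not the Analytics API, just an illustration of the idea):

```javascript
// Hypothetical helper, not part of Google Analytics: fire an alert when a
// metric moves more than `thresholdPct` percent against the prior period.
function alertTriggered(previous, current, thresholdPct) {
  if (previous === 0) return current !== 0;  // any movement from zero counts
  var changePct = ((current - previous) / previous) * 100;
  return Math.abs(changePct) >= thresholdPct;
}

// The 300% surge from the example (say 100 -> 400 visits) trips a 50% alert:
alertTriggered(100, 400, 50); // true
alertTriggered(100, 110, 50); // false: a 10% bump stays under the threshold
```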
Posted in analytics, Google | No comments

Monday, 26 October 2009

R.I.P. Geocities

Posted on 08:49 by Unknown

In my post "Wait - GeoCities Still Exists?" I mentioned that on October 26 GeoCities was going away. Well, that sad day is upon us. And if you didn't follow those GeoCities links I posted, you are SOL now. However, more tributes have popped up on the web in honor of this historic (hysterical?) day.

  • XKCD themed their site today.
  • The irascible Bruce Lawson has created a GeoCities template for CSS Zen Garden.
  • GeoCities tribute with lots of helpful comments at Reddit
  • Mashable links to a group trying to download and archive all of GeoCities.
Posted in | No comments

Friday, 23 October 2009

Usability Testing vs. Expert Reviews

Posted on 11:11 by Unknown

An article at UX Matters titled "Usability Testing Versus Expert Reviews" takes a reader question and tosses it to a series of experts to answer:

Under what circumstances is it more appropriate to do usability testing versus an expert review? What are the benefits and weaknesses of each method that make one or the other more appropriate in different situations?

The experts ultimately all came up with similar answers — do both. Start with the expert review to take care of low-hanging fruit and then bring users in for the testing phase to catch the issues that trip them up. Some quotes:

Expert reviews are especially useful for finding violations of usability standards and best practices. These are often obvious problems that may or may not cause problems during usability testing.
Before doing usability testing, it is helpful to do at least an informal expert review to determine what to focus on during testing.
I recommend always doing both. Expert reviews that are performed by specialists, using standards and heuristics, reveal easy-to-catch usability problems in a very cost-efficient way.
While usability testing is more powerful than expert review, both methods in combination are great, because you first want to discover the low-hanging fruit and get them out of the way.
We need to remember that expert review is a user-free method. Regardless of the evaluators' skill and experience, they remain surrogate users—expert evaluators who emulate users—and not typical users. The results of expert review are not actual, primary user data and should lead to—not replace—user research.

There aren't any real surprises here, but it's interesting to see the different approaches suggested by each expert.

Posted in usability, UX | No comments

Thursday, 22 October 2009

Bing and Google Add Social Search

Posted on 09:30 by Unknown

Google and Bing have been locked in a struggle recently for search engine dominance. Bing came out of the gates fast and gained a lot of market share, but has appeared to level off recently (another link, and another link). Neither of them wants to lose any ground. Factor in the recent explosion of Twitter and other near-real-time social media outlets on the web and people's desire to search them all, and you have two search giants salivating over new opportunity.

Both Microsoft and Google announced partnerships with Twitter yesterday. There had been fears of one making an investment in Twitter and locking the other out from search results, but those fears appear to have been assuaged. At least for now. Consider that Microsoft already has a sizable cash investment in Facebook ($240 million, or 1.6% of Facebook's valuation at the time) giving them a leg up over Google on searching within Facebook. It seemed that the same thing might happen with Twitter (a company that just closed a deal for another $100 million in funding, pushing its value to ~$1 billion).

Google made its announcement at the Web 2.0 Expo, showing off its Social Search feature from Google Labs. The same presenter, Google's VP of Search Products and User Experience, then posted this to the Google blog:

...[W]e are very excited to announce that we have reached an agreement with Twitter to include their updates in our search results. We believe that our search results and user experience will greatly benefit from the inclusion of this up-to-the-minute data, and we look forward to having a product that showcases how tweets can make search better in the coming months.

Mashable has some more detail on what was discussed at the in-person announcement, including some information on linking your social media profiles in to Google to allow searching across them as well.

Microsoft signed two deals on this same day, one with Twitter to bring its tweets to Bing, and one with Facebook to bring status updates to the search engine (another link). The Bing blog reports on its Twitter search:

...[T]oday at Web 2.0 we announced that working with those clever birds over at Twitter, we now have access to the entire public Twitter feed and have a beta of Bing Twitter search for you to play with (in the US, for now).

Essentially the Twitter search has been rebuilt within Bing, including the real-time updates. Bing also includes relevancy, based on things like the number of retweets, keywords, and the quality of the tweets (who decides this?). Bing also shows the trending topics as a tag cloud. In case you haven't tried it yet, head on over to Bing.com/twitter.

The next few months should be interesting as both Bing and Google tweak and enhance their new offerings. Given their same-day announcements, we can expect both teams to be watching the other closely and responding to changes quickly.

In the meantime I'll be scratching my head over why Twitter, a place solely for me and millions of others to spout our own brand of crazy in real-time, is valued at $1 billion.

UPDATE: Read this swell article, just posted: Social Search: Customers Influence Search Results Over Brands.

Posted in Bing, Facebook, Google, Microsoft, social media, Twitter | No comments

Wednesday, 21 October 2009

Firefox 3.6 to Support Web Open Font Format

Posted on 14:34 by Unknown

Mozilla's developer blog today posted that they have added support for the Web Open Font Format (WOFF) to Firefox 3.6. Firefox 3.5 gave us support for linking to TrueType and OpenType fonts, but this takes it a step further to support a format that is more robust for two key reasons:

  1. WOFF is a compressed format, resulting in files smaller than an equivalent TrueType or OpenType font.
  2. It avoids DRM and domain labels by containing metadata about its source, garnering support from typeface designers and foundries.

If you want to see this in action, you'll need to grab a Firefox 3.6 nightly build until the full release is out the door. If you should feel compelled to do that, the nice folks at Mozilla have provided a sample block of CSS that uses the @font-face rule to link to a WOFF font:

@font-face {
  font-family: GentiumTest;
  src: url(fonts/GenR102.woff) format("woff"),
       url(fonts/GenR102.ttf) format("truetype");
}

body {
  font-family: GentiumTest, Times, Times New Roman, serif;
}

Structured this way, browsers that support the WOFF format will download the WOFF file. Other browsers that support @font-face but don’t yet support the WOFF format will use the TrueType version. (Note: IE support is a bit trickier, as discussed below). As WOFF is adopted more widely the need to include links to multiple font formats will diminish.

If you are fortunate enough to have a build of Firefox 3.6 already up and running on your machine, go to a test page using ff Meta set up by the nice folks at edenspiekermann (if part of the name is familiar, Erik Spiekermann is the designer of the Meta family, and if the typeface is familiar, it's what we use at Algonquin Studios). The image below shows how that page looks in Firefox 3.6 using ff Meta (left side) and how it looks rendered in Internet Explorer 8 (right side).

Screen shot showing page with ff Meta typeface on one half, not on the other.

Because IE8 only supports the EOT format, the blog offers some code to account for IE8 in the CSS. Because IE8 doesn't understand the format hints, it will parse the hints as part of the URL, resulting in requests to the server for files that don't exist. The end user will see things just fine because of the EOT reference, but your logs will show some odd 404s as a result of this technique. The Mozilla post has more details on this and some other issues. The code to do this:

@font-face {
  font-family: GentiumTest;
  src: url(fonts/GenR102.eot);  /* for IE */
}

@font-face {
  font-family: GentiumTest;
  /* Works only in WOFF-enabled browsers */
  src: url(fonts/GenR102.woff) format("woff");
}

The main Mozilla blog has a post today listing the supporting organizations with the following endorsement:

We endorse the WOFF specification, with default same-origin loading restrictions, as a Web font format, and expect to license fonts for Web use in this format.

Updates

  • font_dragr: a drag and drop preview tool for fonts
  • after Firefox 3.6 – new font control features for designers
Posted in css, Firefox, fonts, Internet Explorer, standards, typefaces, WOFF | No comments

Tuesday, 20 October 2009

"Myth of Usability Testing" at ALA

Posted on 07:22 by Unknown

There is a very good article over at A List Apart today titled "The Myth of Usability Testing." The article starts off with an example of how multiple usability evaluation teams, given the same task and allowed to run at it as they saw fit, had far less overlap in found issues than one would hope.

The author goes on to explain why usability evaluation is unreliable with a series of examples (which seem painfully obvious, and yet these mistakes keep happening) broken into two main categories:

  • "Right questions, wrong people, and vice versa."
    Using existing users to evaluate a site is a loaded approach; they already have expectations set by the current site that taint their ability to see other options. Conversely, asking new users to complete tasks driven by an existing design is not a good way to evaluate new approaches.
  • "Testing and evaluation is useless without context."
    It is common for me to hear the statement that "nothing can be more than two clicks from the home page," but this ignores the real context of the site and its users. These blanket statements or goals can harm an evaluation when instead a test should start with an understanding of user goals, success metrics, and real site goals.

From here the article outlines what usability testing is actually good for, and then helps focus the reader on the reality of the testing process and its results. I'm glossing over the other 2/3 of the article partly because I wanted to draw attention to the bits above and partly because you should just go read it already. There are some good links in the article for tools that can help identify trouble spots and support an evaluation.

Posted in design, usability, UX | No comments

Sunday, 18 October 2009

Current CSS3, HTML5 Support

Posted on 07:34 by Unknown

The Tool

Last week saw the launch of FindMeByIp.com, a very handy web site that displays a user's current IP address (along with a geographic breakdown to city, if possible), user agent string (browser, version and operating system) and support for CSS3 and HTML5 (read the article about it). It accomplishes this last bit by using the Modernizr JavaScript library to test a series of features in your browser:

  • @font-face
  • Canvas
  • Canvas Text
  • HTML5 Audio
  • HTML5 Video
  • rgba()
  • hsla()
  • border-image
  • border-radius
  • box-shadow
  • Multiple backgrounds
  • opacity
  • CSS Animations
  • CSS Columns
  • CSS Gradients
  • CSS Reflections
  • CSS 2D Transforms
  • CSS 3D Transforms
  • CSS Transitions
  • Geolocation API

If you are running the Google Chrome for Internet Explorer plug-in, it will show up in your browser user agent string on this page. However, it will also report your Internet Explorer browser as Chrome.

The Results

The site developers ran their own browsers through this tool, and that collection of information has been gathered up and posted to provide information on current browser support for the latest standards (or recommendations). Deep Blue Sky, one of the developers of this site, has written up the level of support in all the A-Grade browsers. Yahoo's Graded Browser Support model describes A-grade browsers as "... identified, capable, modern and common" and claims "approximately 96% of our audience enjoys an A-grade experience." This is their codified way of simply saying, "the latest common browsers." If you read my post, Browser Performance Chart, then you'll see the same browsers. The article, Browser support for CSS3 and HTML5, covers the following browsers (the author tested only Windows versions):

  • Safari 4 (Win)
  • Firefox 3.5 (Win)
  • Google Chrome (Win)
  • Opera 10 (Win)
  • Internet Explorer 6, 7 & 8

The surprise in here is that Safari 4 outshone (outshined? outshinerified?) the others, with Google Chrome coming up close behind. Firefox 3.5 was missing some key CSS3 support and Opera 10 was surprisingly lacking. As expected, Internet Explorer brings up the rear, lacking in everything but @font-face support (even in IE8), which has actually been there since version 6 (but only for the .eot format).

Posted in browser, Chrome, css, Firefox, html, Internet Explorer, Opera, Safari, standards | No comments

Saturday, 17 October 2009

Personas in Comic Format

Posted on 13:02 by Unknown

For developers, and clients, struggling with the concept of personas, there is a very easy-to-read primer in the form of a comic over at the ThinkVitamin blog in an article titled "How to Understand Your Users with Personas."

The concept of personas was first introduced in the book The Inmates Are Running the Asylum (1998) as a tool for interaction design. In short (very short), personas are intended to help you (or your client) understand the needs of your users — their goals, experience, objectives, perspectives, etc. If I tell you too much, the comic won't really be worth reading, so go read it already.


It's like they're in my head.

Posted in design, usability | No comments

Friday, 16 October 2009

Browser Performance Chart

Posted on 09:02 by Unknown

Jacob Gube has posted a handy chart over at Six Revisions titled "Performance Comparison of Major Web Browsers." He tests the current versions of five browsers:

  • Mozilla Firefox 3.5
  • Google Chrome 3.0
  • Microsoft Internet Explorer 8.0
  • Opera 10.0
  • Apple Safari 4.0

In his tests he used the following performance indicators, tested three times each with an unprimed cache, and averaged the results:

  • JavaScript speed
  • average CPU usage under stress
  • DOM selection
  • CSS rendering speed
  • page load time
  • browser cache performance
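
The averaging step is straightforward; here is a minimal, hypothetical harness in the spirit of those tests (not Gube's actual code, and an unprimed browser cache is a condition a simple timer can't reproduce):

```javascript
// Run a benchmark task a fixed number of times and report the arithmetic
// mean of the wall-clock timings, in milliseconds.
function averageRuns(task, runs) {
  var total = 0;
  for (var i = 0; i < runs; i++) {
    var start = Date.now();
    task();
    total += Date.now() - start;
  }
  return total / runs;
}

// Example: average three timed runs of some busywork.
var avg = averageRuns(function () {
  for (var j = 0, n = 0; j < 1e5; j++) n += j;
}, 3);
```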

When Google Chrome 3 was released in September, Google claimed a 150% increase in JavaScript performance from its previous version but didn't offer comparisons to other browsers. Opera also made improvement claims in its September release of Opera version 10, specifically a "40% faster engine." Those claims are also not made in comparison to other browsers. The overall performance ranking he assigns to the browsers isn't too much of a surprise, with Google Chrome as the best and Internet Explorer 8 as the worst. If he ran as many persistent tabs in Firefox as I do, it might not hold on to the #2 slot so easily. The ranking:

  1. Google Chrome 3.0
  2. Mozilla Firefox 3.5 (tied with Safari)
  3. Apple Safari 4.0 (tied with Firefox)
  4. Opera 10.0
  5. Microsoft Internet Explorer 8.0

Use the chart below to link to the full size version. You can also download the raw data in CSV format from the site in case you want to try your own hand at colorful charts.

Small view of browser performance comparison chart.

Posted in browser, Chrome, Firefox, Internet Explorer, Opera, Safari | No comments

Thursday, 15 October 2009

Developer Discusses Dyslexia and Dyscalculia

Posted on 10:40 by Unknown

Sabrina Dent, a web designer hailing from Ireland, has blogged today about her struggles with dyslexia and dyscalculia when using web applications in the post, "Dyslexia, Dyscalculia and Design". For some context, she links to the Wikipedia article on dyscalculia and highlights the bits that apply to her:

  • An inability to read a sequence of numbers, or transposing them when repeated, such as turning 56 into 65.
  • Problems with differentiating between left and right.
  • Difficulty with everyday tasks like checking change and reading analogue clocks.

Sabrina discusses her experience with the examples of login screens (specifically the steps that require more detailed information than a username and password), phone numbers, and booking calendars.

It's a brief post, but it's insightful. As a web designer she understands the motivation for these types of interfaces, but that doesn't mean they are easy for her to use.

Posted in accessibility, design, usability, UX | No comments

Wednesday, 14 October 2009

Derek Powazek on SEO as Snake Oil

Posted on 10:12 by Unknown

There are many on the web who will recognize the name Derek Powazek. He is the name behind old-school sites such as Fray.com and Kvetch.com (which has apparently been taken over by spam bots) and wrote a book about communities (Design for Community, which mentions me by name, which is awesome). I also had the pleasure to meet him at SXSW back in 2001 and even participate in his Fray Cafe. So when I saw his blog post on SEO that started off with this statement, I was pleased:

Search Engine Optimization is not a legitimate form of marketing. It should not be undertaken by people with brains or souls. If someone charges you for SEO, you have been conned.

What pleases me more is that it echoes a comment I made in my post Verified: Google Ignores Meta Keywords back in September:

Those of us trying to protect our clients from SEO/SEM snake-oil salesmen are happy to finally have an official statement from Google.

Now that I've tooted my horn and compared myself to someone considered one of the top 40 "industry influencers" of 2007 by Folio Magazine, let me get to my point. I've been working on the web since HotJava was still a browser, was excited when the first beta of Netscape Navigator made its way to the world, when Yahoo was a couple of guys in a dorm posting links, and when my Jolt Cola web site was included in their index because I asked them to include it. Since then the way people find things on the web has changed dramatically. For the last decade or so the search engine has become more than a convenience; it's a necessary feature of the web, without which we'd be stuck wandering billions of terrible pages of things we don't want to see (many thousands fewer of those pages once GeoCities finally closes down). Because of this, getting your site into the top spot on a search engine has become the holy grail of online marketing, one that far too many people are happy to exploit as an opportunity.

Derek makes two key points in his article:

  1. The good advice is obvious, the rest doesn’t work.
  2. SEO is poisoning the web.

He supports the first point by noting that formatting, structure, summaries, quality links and so on have worked since the beginning and will continue to work. There's no magic there. It's free to read anywhere on the web. For his second point he references all the Google bombing tactics that are employed by bots to spam blogs, comment areas, twitter accounts, parked domains, etc. as well as questionable tactics that exploit loopholes (albeit temporary ones) in a search engine's ranking algorithm.

As of now the article has 180 comments, many of which are optimizers who take umbrage with the blanket statement that SEO is the work of the soulless foulspawn and their dark arts (my words, but I think I summarize his sentiment well enough). After receiving so many comments Derek added a post yesterday, his SEO FAQ, responding to a generalization of many of the questions and comments. He also offers some suggestions, including this one targeted at clients (I just took the first part):

If someone approaches you about optimizing your search engine placement, they're running a scam. Ignore them.

Having said something similar to clients in the past, this is normally where I'd segue into a discussion about how I've worked hard to ensure Algonquin Studios' content management system, QuantumCMS, adheres to best practices and provides many ways to get quality content into the pages, links, titles, page addresses, and meta information (after I tell them Google doesn't use meta data for ranking, but they insist because they've been conditioned to think that way). This is also the part where I remind clients that their help is needed to write that copy; to interact with users, customers, partners, industry organizations, etc. to generate quality relationships and references (often in the form of links); and to plan to spend time working on this regularly to keep it fresh and relevant.

I look forward to the time when I won't be spending chunks of my day clearing spambots from my QuantumCMS Community forum, batting down spam email about submissions to 300+ search engines, ignoring bit.lys in unsolicited Twitter @ responses, and generally fighting the after effects of all the black hat SEO enabling we've allowed for years.

Posted in rant, search, SEM, SEO | No comments

Tuesday, 13 October 2009

Come See Me: November 3

Posted on 13:32 by Unknown

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203 (map below). BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.


Posted in SEM, SEO, social media, speaking, standards | No comments

Friday, 9 October 2009

Wait - GeoCities Still Exists?

Posted on 13:34 by Unknown

October 26, 2009 marks the end of an era. GeoCities, the much-maligned free hosting service offered since way back in the early days of the web, is closing down. Like Abe Vigoda, it's something many people assumed had already fizzled out long ago.

Dating back to 1995, GeoCities provided users with free web hosting accounts using their "neighborhoods" model and was getting over six million monthly page views. In 1997, GeoCities introduced the now infamous ads on its pages and still became the 5th most popular site on the web. At its peak in 1999, when GeoCities held the #3 spot on the web (behind AOL and Yahoo), Yahoo paid $3.65 billion ($3.57 billion according to Wikipedia) for it. That's in billions. Some of us may remember the furor that took place when Yahoo laid claim to all rights over content (copy, images, etc.), a position Yahoo eventually reversed. And then there was the dot-com bust. Over time GeoCities became synonymous with terrible, tacky web pages made up of animated graphics, under construction icons, background music, and browser download badges, among other horrors of the bygone days of the web.

Given Yahoo's recent cost-cutting measures, it makes sense that the free service would be one of the offerings to go. Yahoo announced the closure on April 23 of this year, but yesterday it sent users a final notice explaining that they can move their sites to Yahoo's $4.99/month hosting service, and warning that on October 26 Yahoo is flat-out deleting GeoCities. All of it. With no chance of recovery. Period. So there.

Sadly, I don't know what I'll do without some of these GeoCities standbys, which haven't been updated in nearly a decade. That's 10 years. I carry on conversations with IRC bots younger than that.

  • The Worst Simpsons Site Ever
  • All-Wright Site (1,287,346 hits since 04/03/96)
  • Dr. Doo Wop
  • Ska*Anarchy
  • GeoCities tribute with lots of helpful comments at Reddit

At least we still have Angelfire to deride...


Thursday, 8 October 2009

October 6 Panel Follow-up

Posted on 10:47 by Unknown

For those of you who attended the Business First Power Breakfast: Online Networks this past Tuesday, October 6 and have reached out to me to offer feedback and ask questions, thanks for your interest and thanks for coming. I was thrilled to see a room full of keenly interested attendees.

Based on some of the questions I have received since then I'd like to make a few comments.

First, the Social Media Revolution video I posted Tuesday afternoon is not mine. I had nothing to do with it; it was playing when I got there. It's a great overview of the growth of social media and other online technologies, but it isn't mine; I just embedded it so you could view it here in case you forgot the title.

Second, some of you realized, in retrospect, that the guy you saw taking photos of the food with his cell phone camera was me. I am surprised how many people caught on to my comments about food photos and I know some of you went looking for the photos on my Twitter feed. Unfortunately, because I use Brightkite for my photos (which posts to Twitter) and because Brightkite went down that morning for a massive upgrade (and still hasn't come back up), those photos never made it out. So I've dropped smaller versions of them below.

Posted in social media, speaking

Tuesday, 6 October 2009

Social Media Revolution Video

Posted on 09:24 by Unknown

If you attended the Business First Power Breakfast: Online Networks event this morning where I was a panelist, then you saw the Social Media Revolution video at the beginning and end of the panel. In case you didn't catch the address or title, here's the video.

If you were at the event this morning and came here as a result of hearing me speak, thanks for coming. If you weren't at the event this morning then you had better get me in to speak to your organization already.

Posted in social media, speaking

Monday, 5 October 2009

List of URL Shorteners Grows Shortener

Posted on 16:16 by Unknown

One of the many URL shorteners has announced that it is shutting down in just under three weeks. Cli.gs has announced, via its blog, that it will no longer accept new URLs to shorten as of October 25. It will also stop logging analytics. Cli.gs will still forward URLs through November, but no guarantees have been made beyond that. Since Cli.gs has been a one-man show serving tens of millions of forwards a month, with the site owner funding it from his own pocket, it's not too much of a surprise.

Back in early August another URL shortener, Tr.im, called it quits. The argument Tr.im posted on its page was essentially that there is no way to monetize URL shortening, and without money to be made it's hard to justify further development. Tr.im committed to supporting its links through December 31, 2009. However, shortly after that post, Tr.im claimed to be resurrected. Since that announcement, Tr.im has begun its progression into an open-source, community-owned endeavor. Its success is now at the mercy of how much time developers are willing to donate.

Both URL shorteners have, either directly or indirectly, referenced the lack of revenue from their business model, the number of competitors, and the fact that Twitter has standardized on Bit.ly, making it the de facto leader in the Twitterverse, and by extension, much of social media.

URL shorteners work by taking any URL a user provides and returning a much shorter address that can be pasted into a tweet (with its renowned 140-character limit), an email message, or anywhere else a terribly long link might be cumbersome. Some of them redirect users straight to the target page, while others, like the not-so-short Linky.com.au, load the target page into a frame with their brand (plus statistics and other link information) sitting at the top of the viewport. In the future, this brand area could be used to serve ads, possibly generating revenue.
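The mechanics are simple enough to sketch. The following is a minimal, hypothetical illustration in Python (the `Shortener` class and the `sho.rt` domain are made up for this example, not any real service's code): hand out sequential IDs, encode each one in base 62 to keep the code short, and keep a lookup table for the redirects.

```python
import itertools
import string

# Base-62 alphabet: 0-9, a-z, A-Z. Short codes stay compact because
# each character carries 62 possible values.
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def encode_base62(n: int) -> str:
    """Encode a non-negative integer as a base-62 string."""
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))

class Shortener:
    """Hypothetical shortener mapping long URLs to codes on a made-up domain."""

    def __init__(self, domain: str = "sho.rt"):
        self.domain = domain
        self.urls = {}                      # short code -> long URL
        self.counter = itertools.count(1)   # next ID to hand out

    def shorten(self, long_url: str) -> str:
        code = encode_base62(next(self.counter))
        self.urls[code] = long_url
        return f"http://{self.domain}/{code}"

    def resolve(self, code: str) -> str:
        # A real service would answer with an HTTP 301/302 redirect here.
        # If the service shuts down, every code it ever handed out goes dead.
        return self.urls[code]

s = Shortener()
short = s.shorten("http://example.com/a/very/long/path?with=query&strings=1")
print(short)           # http://sho.rt/1
print(s.resolve("1"))  # the original long URL
```

The comment in `resolve` is also the whole link-rot problem in miniature: the mapping from code to destination lives only with the service, so the short links are only as durable as the company behind them.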

The major complaint with URL shorteners is that users cannot always see where a link will take them, which can be a boon for all those fake Twitter accounts trying to link users to porn (sorry, pr0n). This has led to browser add-ons and Twitter applications that can expand the URL in a tool-tip before you click and get surprised by a screen (and speakers) full of NSFW content.

My additional complaint is that URL shorteners, and specifically their demise, lead to link rot on the web. If any of these services were to shut its doors tomorrow and stop all redirections, every link using that service would become a dead end. In the case of tweets, we're talking about millions and millions of tweets that would become nonsense (more so than many of them are now). Emails, articles, or anything else relying on these URL shorteners would likewise become orphaned references without context. If the Mayans had it right, that could be the End of Days we're all expecting in 2012.

Even if you don't believe in the great Linkpocalypse, you should take a few moments to read up on link rot at Wikipedia (an article which is sadly devoid of any mention of URL shorteners).

Posted in internet, social media, standards

Friday, 2 October 2009

Facebook and Google Want to Translate Your Site

Posted on 08:30 by Unknown

Translations for Facebook Connect

Earlier this week Facebook announced a new service built on the Facebook Connect API called Translations for Facebook Connect. In general, the idea behind this tool is to let developers translate a web site (into any of the 65+ languages Facebook currently supports) by crowdsourcing the translation itself. Leaning on its own experience letting users translate the Facebook user interface, Facebook is essentially opening its translation workflow to the world.

From the How It Works portion of the announcement:

After you choose what languages you want your site or application to support, you can get help from the Facebook community to translate your site, as we did, or you can do the translation yourself, or make a specific person the administrator of the process. [...] Once you register content for translation, your connected Facebook users can start translating your sites' content just as users helped translate Facebook.

Facebook has also created a new client-side feature set, using its XFBML framework, that lets developers automatically submit content wrapped in an fb:intl tag for translation and lets translators translate that content inline. Facebook has put together a simple demo to show this in action.

The Facebook Developer Wiki article on internationalization outlines the process from start to finish. Here you can see that the process isn't just one of machine translation, but that a site owner must designate content strings (pages, words, phrases, UI elements, images, etc.) that are to be translated into a language selected by the site owner. The article also provides samples of how a translator might perform bulk translation or inline translation with screenshots showing orange underlines and menus that appear on a right-click to perform the translations.

The most important part of this tool, however, isn't the technology but a fundamental understanding of the process of translation. Facebook has a pretty good article on Internationalization Best Practices, gleaned from its own experience translating its site, that at least offers a primer to those who have never been through this process. It addresses, albeit without a lot of detail, some of the pitfalls those of us in the world of localization have experienced, such as the tendency for translated phrases to contain more characters than the original (which can wreak havoc on a pixel-precise layout), or choosing general phrases to reuse throughout the site instead of translating similar-but-different bits of wording over and over.

Caveats

Unlike machine translation, humans are performing the translations, greatly increasing the likelihood that they will understand context. However, there are many conditions that must be satisfied or assumptions that must be made to implement this set of tools...

Does the web site attract users who are fluent in another language (the desired language)? If the site isn't already in the native tongue of a user, why would he/she be there?

Are these users willing to help translate your site for free? Given the concept of "you get what you pay for," there has to be a consideration for the quality and skill of the translators who take it upon themselves to do this work.

Is the translator a subject matter expert? You may not want to have just anyone translate your site about widgets. Widgets may require some very technical language that a casual reader may not grasp and may not understand in context. Words that are mundane in daily use may have a very specific meaning in the world of widgets and might not suffer a loose translation well (or perhaps aren't even supposed to be translated).

Is the translator part of your target audience? It's one thing to understand Portuguese because you are from Portugal. It's another thing entirely to translate into Portuguese for use by Brazilians. Understanding the region, dialect, and local idioms can be very important for a proper translation (which is why we call it localization).

Are the site owners comfortable letting unknown third parties translate their message? Tag lines, marketing lingo, and other carefully crafted content usually don't translate easily. Even with an approval process in place, it may take many passes before site owners are confident in a brand's translation into a language they do not speak.

Does the site already have Facebook Connect enabled? If not, then time must be taken to add the appropriate code to each page of the site to be translated.

Is time budgeted to identify content for translation? Somebody has to walk through the entire site (or at least any pages to be translated) and mark up the content for translation using what may be arcane XFBML tags.

Does the user even allow Facebook Connect? If your ideal translator doesn't want to log in to Facebook Connect on your site, then all this is moot. The user/translator has to trust both Facebook and your site, and feel altruistic enough to want to help.

Google Web Site Translator Gadget

The day after Facebook announced its new translation tool, Google reminded us that it has been doing this for a while by announcing its Web Site Translator Gadget. The Google widget works by providing a web site developer with a block of HTML and JavaScript to drop on the page of an existing site. This code draws the menu that allows the site to be translated. You can grab the code at the Google Translate Tools page. It currently supports 51 languages.

The Google Translator Toolkit powers this machine translation tool, and while Google states that the toolkit learns from corrections to translations provided by users, it's still just machine translation without human intervention. It's also far easier to implement on a site and doesn't require much overhead. As an aid to the user, when the mouse hovers over the translated content, the block of copy is highlighted and the original content is displayed using a "tooltip" (or the title attribute of the element).


The Google Translate human intervention cycle

Because Google Translate has been around for so long, many users are accustomed to its quirks and know better than to rely on it to translate marketing content or poetic prose. Where it excels is in quickly letting a site visitor get the gist of a page, understand an address, or otherwise pull some use out of content that isn't too intertwined with context.

Please understand that I am not referring to the "Translate" feature that was added to the Google Toolbar for Firefox as of yesterday (October 1). That is a client-side function that operates independently of the web site owner and I am only addressing translation features that a web site owner might want to implement to translate his/her site.

Caveats

Machine translation is a risky endeavor. Even the most robust natural language processors have trouble with elements of language that humans understand naturally, such as context, ambiguity, syntactic irregularity, multiple meanings of words, etc. An example often used in this discussion is evident in these sentences:

Time flies like an arrow.
Fruit flies like an apple.

The words "flies" and "like" have completely different meanings in the two sentences. Only our knowledge of the metaphor and the bug allows us to understand the difference without further context. Acting as my own brand manager, I took the introductory paragraph from my web site (excuse my ego, in case you hadn't noticed it already):

I am a founder, partner, and Senior Usability Engineer at Algonquin Studios, responsible for bridging the gap between the worlds of design and technology. With experience in both, I bring a unique perspective to projects, allowing both design and implementation to merge seamlessly.

I translated it into German and then back to English:

I am a founder, partner and senior usability engineer at AlgonquinStudios, responsible for bridging the gap between the world of design and technology. With experience in both, I bring a unique perspective on projects, up so that both design and implementation to proceed smoothly integrated.

We can see it does pretty well for the first half of the paragraph and then it falls apart. This experiment is inherently unfair, of course, but it does demonstrate some of the risks with machine translation. It becomes particularly problematic when proper names that correspond to common words are translated.

Which to Use?

Machine translation is a risky proposition at best. You have no control over what content comes back to your end user. Adding a Google Translate feature to your site can give some users the impression that you have effectively signed off on the translated content. If that's not a concern, perhaps because of the nature of your site (fan site, personal site, etc.), then it's certainly your cheapest option. If you have a business site, this may not be the right path for you. You may still want to link to the Google Translate page to let end users perform translations on their own; at that point you have set the expectation that this is not a service you provide and that you are not responsible for how bizarrely an idiom may be translated.

Crowdsourcing your translation may sound like a better idea simply because you now have humans making decisions about what makes sense in the translation, but you have to take into account the caveats above. Are you prepared to pay for (or spend) the time preparing a site for translation and then babysitting the process, hoping the entire time someone with enough linguistic skill will come along and translate it?

This is where I would lean on a third option: hire professionals who have done this before. If you are talking about translating your corporate brand and message, then you shouldn't leave it to machines or to the waiting game of altruism. If you just want to translate a personal site, a fan site, or perhaps a strong community site (like Facebook), then either of the other options may work well for you, depending on what you can commit. It may even be worth considering Google as the short-term option while waiting for the Facebook option to pay off.

Posted in Facebook, g11n, globalization, Google, i18n, internationalization, L10n, localization, translation

Thursday, 1 October 2009

Come See Me: October 6

Posted on 08:23 by Unknown

I will be one of the panelists at the Business First Power Breakfast: Online Networks, this coming Tuesday, October 6, 2009 at 7:30am at Salvatore's Italian Gardens, 6461 Transit Rd., Depew, NY 14043 (map below).

From Business First:

ONLINE NETWORKS - Facebook * Twitter * LinkedIn * YouTube - MAKE THEM WORK FOR YOU!

Are you the social type? Are you networked? Join us for breakfast to learn how to advance your career, products and services through online social networks.

Our panel of experts will help you learn:

  • the distinct differences between the top online social networks
  • to strategically build your brand
  • how to grab people's attention and create buzz online
  • potential pitfalls, and how to avoid them

The Power Breakfast panel of experts will first answer questions posed to them by Business First. This will be followed by questions submitted by members of the audience.

Panelists:

  • Brett Burnsworth; Zoodle Marketing
  • Brad London; RescuedSites.com
  • Mary Beth Popp; Eric Mower and Associates
  • Adrian Roselli; Algonquin Studios

Read up on it and register at the Business First site.


Posted in social media, speaking