by Whit Andrews | June 12, 2014 | Comments Off
Very few people click the button that lets them know why a song gets played on Pandora. I don’t know how few, but I know it’s probably only marginally better than the proportion who have ever clicked the “Advanced Search” link on any search engine, which most search vendors will tell you is pretty much the dude who engineered it and his roommate.
But in enterprises and in privacy, transparency and the promise of transparency are fundamental. For users who love to hate the assumptions that algorithms make, and for users who love to tune them, the ability to see how they work is extremely important. In search, we call it the “What the Heck” button. (Actually, I am slightly saltier in known company, but you get the drift.)
My picture; Brooklyn last December.
So Facebook’s decision to peel back the skin of its sometimes imbecilic ad brain is very smart. Power users can direct themselves to fiddling with knobs and dials instead of complaining. (That would be me.) Privacy advocates will have less to complain about. And the ads might get better (although any search vendor will tell you that what people do is more important than what they think they want).
Consumerization means this will make transparency more important in enterprise search, too. I use Pandora’s and Amazon’s recommendations often to explain why vendors should make it easier for admins and superusers to understand why certain results are suggested; this will make it easier still.
When search must be accountable — and that’s not always, but it is often — transparency is a key capability.
Category: Uncategorized Tags:
by Whit Andrews | June 5, 2014 | Comments Off
Ditching handwriting as a key aspect of curricula captures attention and excites educators and parents. It’s a watershed change to move away from teaching writing to pupils at a young age. The New York Times tells me Common Core moves toward keyboard proficiency in elementary school. The article goes on to studiously and decorously lament this, citing credible evidence of the positive effects that good handwriting training offers children.
My own handwriting as a child was atrocious. My mother’s is legendarily idiosyncratic; my son hands me her cards, still, to be deciphered. She typed as part of her work for many decades. My father’s is bold; he never typed for work, and still finds keyboards challenging. We transitioned my son as early as possible to keyboards; his writing invited unfortunate attention from teachers until keyboards replaced it.
But let us not descend into the valley of either-or. Handwriting is hard for some of us. Many a teacher leaned companionably or with intensifying exasperation over my shoulder to tell me to fix my pencil-holding form. (“Imagine a bunny. Do you see its eye? I can’t see the bunny’s eye!”) I hated handwriting worksheets. I despised rote learning. Did the learning harm me? No. On the other hand, my handwriting is now block print, and mostly used for notes and lists.
The key here, as always, is to look at the whole picture. Pew research tells us that 79% of American teachers agree that digital tools such as the Internet and social media “encourage greater collaboration among students.” Those digital tools are more available to people who use keyboards than people who write. That’s obvious. So there’s a benefit from typing.
And the subtext here from penmanship advocates is that we need to take time in the school day to focus on handwriting. But we’re already losing time out of the day for art. The proportion of elementary schools offering visual arts instruction fell in the decade from 2000 to 2010, which won’t come as a surprise to anyone who watched budgets at schools crumple. And cursive is nothing if not art. Calligraphy in any alphabet or ideographic system is, too.
Something that saddens me is the manufacture of debate that seems intended to force a conflict. “Penmanship was an art!” says a teacher in Education World’s inflammatorily titled “Have Computers Forced Handwriting Out of the Picture?” The article then paraphrases her, noting that “All cursive was to be the same; individual styles were not acceptable,” which does not sound like it was taught as an art to me, and certainly reminds me why I hated it so much. (The emotion echoes in me still.)
We don’t need to make everything a point/counterpoint debate. Handwriting triggers aspects of learning that are valuable and meaningful. So does gathering text directly from keyboards. (The folks at “Handwriting Without Tears” also offer “Keyboarding Without Tears.” Which is pragmatic.) And, I think more important, so do visual art education and every other aspect of a learning system. As a technology analyst, I am particularly sorry to see this cast, as is so often the case, as an idyllic past vs. steampunk present debate. It’s not about then vs. now; it’s about understanding now what kind of value we got then, and recasting it for now. Technology conflicts ginned up to pit preservationists against newfangled thinkers get nothing real accomplished.
by Whit Andrews | June 3, 2014 | Comments Off
The flinty commercial heart that beats in the body of Silicon Valley is an extraordinary thing, pumping ridiculous sums of money through its global extremities in a tireless rhythm. But even more extraordinary is its guileless, amnesiac brain, glorious in its eager, avid forgetfulness.
Twitter cast up Larry Page’s 2013 founder’s letter upon my cognitive beach today. It’s brief, and likable, and hopeful. In many ways, it ignores Google’s now-forgotten tenets of success, and remembers carefully what Google wants to remember most. In Page’s retelling of the late 1990s as filtered through a citation of the 2004 letter, the firm was founded “to develop services that significantly improve the lives of as many people as possible.” (In my recollection, it was a search engine — and a really, really good one, in a time that lacked them.)
In Jim Edwards’ reading of the letter, a particularly important point is that Google is proud of its spare design. I remember when Yahoo would have said the same, and when adding links to Yahoo’s “front page” was itself news. And Page muses on the importance of longer, more cognitively rich queries. I remember when Google trumpeted that the average query length on the Web was falling, falling, falling, and search engines needed to get with it or get out.
But being callow is one of the things that makes Silicon Valley what it is. You don’t get to say “callow” often enough, and especially not without the word “youth” after it — and Page is showing some gray. There’s a determined disinclination to remember in the Valley, which means Yahoo’s early front page is gone from the collective memory. (Hey, I remember when it went to two columns. I WAS THERE.)
It also means that Google doesn’t remember a time when it thought shorter queries were better. (Before…say…Siri.) And it probably shouldn’t, because too much reflection leads to introspection, and introspection means you stop to think too much about some pretty amazing leaps from one rooftop to the next. You don’t make iPads if you remember Newtons; you don’t make Googles if you remember AltaVistas.
Category: Search Uncategorized Tags: google yahoo memory design
by Whit Andrews | May 30, 2014 | Comments Off
A client called this week to ask how she could know enterprise search is working. The CIO wants to know that the company is truly a leader in its search, and to do that, he feels that there need to be solid measurements he can see that show it’s definitely getting better. Even more attractive would be analytics that show it’s better for his company than it is for competitors.
There are lots of ways to tell. I frequently reread Marti Hearst’s delightful, lovely Search User Interfaces to remind myself of some. There’s a whole CHAPTER on “How To Evaluate Search User Interfaces.” (REALLY. IKR? You are as excited about this as I am. And it’s free online. I read it on paper. Don’t hate me because I’m a Luddite.)
Hunting dogs near my friend Jeff Mann’s home in France. They’re HUNTING. Get it? HUNTING. It’s a RELATED IMAGE.
Also, I ran into this intriguing blog entry today from Jaime Teevan via Twitter. (I wish I knew whom to h/t; I can’t find the tweet again in the barrage of interest around some dude who bought a basketball team.) The idea in one study was to send users on scavenger hunts to find the answers to their questions; if they got the answer right, it meant the search worked. In that case, it was mTurk work on public testing, but it makes sense in organizations as well. Frequently, key stakeholders are upset about particular searches, which could certainly serve as straw men to knock down. A/B testing is too rare in such situations; a five-question test might be definitive when selecting between a few options for interfaces.
This is handy; I’ll start suggesting it right away.
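A back-of-the-envelope version of that scavenger-hunt test is easy to sketch. Everything below is hypothetical: the interface names, the pass/fail outcomes, and the sample sizes. It scores two candidate interfaces by task success rate, then applies a rough two-proportion z statistic to gauge whether the gap looks real:

```python
import math

# Hypothetical scavenger-hunt results: 1 = user found the right answer,
# 0 = user did not. Ten tasks per candidate interface, all invented here.
results = {
    "interface_a": [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],
    "interface_b": [1, 0, 0, 1, 0, 0, 1, 0, 1, 0],
}

def success_rate(outcomes):
    """Fraction of scavenger-hunt tasks completed successfully."""
    return sum(outcomes) / len(outcomes)

def two_proportion_z(a, b):
    """Two-proportion z statistic; a rough guide, not a full stats package."""
    pa, pb = success_rate(a), success_rate(b)
    pooled = (sum(a) + sum(b)) / (len(a) + len(b))
    se = math.sqrt(pooled * (1 - pooled) * (1 / len(a) + 1 / len(b)))
    return (pa - pb) / se

rate_a = success_rate(results["interface_a"])  # 0.7
rate_b = success_rate(results["interface_b"])  # 0.4
z = two_proportion_z(results["interface_a"], results["interface_b"])
print(rate_a, rate_b, round(z, 2))  # 0.7 0.4 1.35
```

Note that with samples this small, even a 30-point gap yields a z well short of conventional significance, so quick tests like this are best treated as directional rather than decisive.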
by Whit Andrews | May 29, 2014 | 1 Comment
I had a fun vendor briefing with Brainspace today. Fun, because I’m old now, so it was great to meet a founder who developed his knowledge of search in the Verity era (he built “topics,” for you other grizzled or afflicted folks out there). And fun, because we talked about transparency of relevancy modeling, an important topic in our evaluation of vendors in search, content analytics, and support for knowledge management.
(Flickr user allows this use of his content. Thanks!)
What I mean in the search world by “transparency” is this: Can an administrator, or in an even better case an end user, determine why a particular result set is chosen? It’s a grail, and it’s an asymptote. In the old days, before term frequency/inverse document frequency descended to the Earth and made it possible to rank results meaningfully, “relevancy” meant “hits.” My professor in college showed us that one could filter Moby Dick to see the appearances of a given word or phrase. (“Ah, the world, oh, the whale.” Ah, the DOSfish, indeed.) In such a world, “Moby Dick” is a possible result for “whale” or “Ishmael” or “ship.” No value judgment appeared.
Since then, many such values have emerged, around elements such as term frequency and, with great sophistication, statistical analysis. One vendor was fond of pointing out to me that if the vast majority of users couldn’t understand how the engine worked, it was also unreasonable (and possibly really mean) to expect that vast majority to understand how it reached a given result set. Another vendor points out that its key advantage is how it works, so revealing that is a bad idea, sort of like Coca-Cola writing its formula on the side of the iconic bottle.
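The jump from “hits” to ranked relevancy fits in a few lines. Here is a minimal tf-idf sketch over toy documents I made up (nothing like any vendor’s actual model), including the per-term breakdown that makes such a score explainable to an administrator:

```python
import math
from collections import Counter

# Toy corpus, invented for illustration only.
docs = {
    "moby":  "call me ishmael the whale the ship the sea",
    "guide": "a field guide to the whale",
    "log":   "ship log no whale sighted",
}
tokenized = {name: text.split() for name, text in docs.items()}

def term_contributions(query, doc_tokens, all_docs):
    """Return {term: tf * idf} so a result's score can be explained."""
    n_docs = len(all_docs)
    counts = Counter(doc_tokens)
    contributions = {}
    for term in query.split():
        tf = counts[term] / len(doc_tokens)
        df = sum(1 for d in all_docs if term in d)
        idf = math.log((1 + n_docs) / (1 + df)) + 1  # smoothed idf
        contributions[term] = tf * idf
    return contributions

def score(query, name):
    """Total relevancy score: the sum of per-term contributions."""
    parts = term_contributions(query, tokenized[name], list(tokenized.values()))
    return sum(parts.values())

ranked = sorted(tokenized, key=lambda name: score("whale ship", name), reverse=True)
print(ranked)  # ['log', 'moby', 'guide']
print(term_contributions("whale ship", tokenized[ranked[0]], list(tokenized.values())))
```

Exposing `term_contributions` rather than a bare score is the transparency move: an administrator can see that “log” wins because “ship” is rarer in this tiny corpus than “whale,” instead of taking the ranking on faith.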
Brainspace — not in the current enterprise search MQ I am putting together, as it does not meet all the criteria — believes in transparency and avoiding the “black box” model of “This is what we gave you; take it or leave it.” They’re still developing a profile in the market, of course.
What transparency allows for is improvement. When information architects can see how results are achieved, they can also improve them more effectively. It’s like the tab in Pandora that allows you to see why certain songs are being played for you, or in Netflix to see why you’re being offered particular movies. (Currently, I am being recommended “Mad Men,” which I have tried twice…because I liked “Fargo”? Mmm…k.) It’s like the window in the aquarium to which I link above — by seeing what the query engine does, you can put yourself in the place of a fish to understand the dynamics of the tank.
And it’s fundamental. While transparency in other areas can be a buzzword, in search it’s of great value. Without transparency, authority depends only on faith, which fails in information architecture.
Category: Search Tags:
by Whit Andrews | May 27, 2014 | 2 Comments
I met the folks at Vivisimo back in the earliest days of the question whether search would become BI and BI search (jury still out; expect an O.J. case-class delay). Founder and CEO Raul Valdes-Perez was and is a thoughtful, professorial technologist with a gentle mien. Vivisimo is now among the many vendors IBM’s ambition has made into history, in its case via acquisition. And Valdes-Perez is back.
OnlyBoth is Raul’s current idea. The startup allows for discovery of intersective interest points in large datasets, and its initial application is as a way to find out what sets an institution of higher education apart. (Duke University, from which I graduated despite myself, is researchy.) I researched a few other schools, and found that WPI, around the corner from where I live, is the only school whose top Master’s major is systems engineering. Which, if my degree weren’t in poli sci and English, might mean more to me. What I do get is that it also converts these insights into narrative speech — logical, slightly stilted and unusually worded statements. (Such as, “Clark Univ[ersity] is the only college whose top Doctoral major is geography.” I added what’s in brackets; I can’t help it.)
The interesting question, to me, is whether Raul will emphasize enterprise or consumer more in his new offering. I talked to a client today who wanted to know whether data mining and search might magically intersect, allowing them to know what they need to know before they know they need it. I told them consumers will have to experience that more often before enterprises can expect to buy it or workers to use it. This is one way that could happen.
Last time, Raul had two startups in one. Vivisimo had a search product called Clusty.com, which clustered search results for the consumer user. (Great product; shaky name.) He went enterprise; Vivisimo itself was a clustering engine that became a search engine that became an analytics engine. Now the choice will be his again — invest the most in building analytics for organizations…or for general consumption? Doing both seems…very intersective.
by Whit Andrews | January 12, 2014 | Comments Off
My colleague Adam Preset and I recently mused in research about how, as video gets shorter, organizations must prepare to assemble it in new models — and we chose broccoli and brussels sprouts as our metaphors. Broccoli has branches that branch again and then end; you travel down a broccoli branch and eventually you’ve made choices you can’t back out of easily. Putting lots of little videos together that way is like a configurator — a viewer who makes choices eventually might find the perfect video, but imperfection makes the viewer feel frustrated and trapped in the shrinking fractal prison he grew for himself.
This image brought to you by Steven Lilley, who made it available under a Creative Commons license at Flickr.
On the other hand, there’s Brussels sprouts. That’s a somewhat different perspective, where the stalk stretches in one direction, and the sprouts offer brief single diversions off that inexorable progress. We think that model makes more sense for most assemblies of short videos, because as you travel down a definite linear path, the detours you might choose to take don’t ramp you somewhere you can’t get back from. Things feel less final.
This image brought to you by Arnold Gatilao, who made it available under a Creative Commons license at Flickr.
I was truly delighted to see last week that The New York Times picked a slight variation on the Brussels sprouts model for a very interesting new news delivery model. The “story” (or video, or whatever) is about high-rise buildings in the city and elsewhere. The story has a linear path, but the viewer can branch out and then return at various designated locations to get more complete or detailed information about particular areas they may be interested in. Perhaps not a Brussels sprout stalk as much as it is cucumber vine, since some branches have more than one segment, but you get the picture. Maybe. (Grape arbor? Work with me.)
The video is delightful. It’s easy and a little quirky, nicely matched to its topic (a diagram of it feels a bit like a skyline, not coincidentally), and you can treat it as a sit-back or a lean-forward experience. Oddly, the narration rhymes, although with a rhythm-less spirit that removes it from the realm of the singsong. It’s an excellent model for what informative, and possibly inspirational, video should look and feel like. I imagine it took a significant amount of resources in its conception and possibly also its execution, because it uses some interesting visual tricks like subtle animations of historical imagery. Most organizations could never spend the money that I’m sure it cost more than once a year on executive messaging, but less spiffy efforts are certainly achievable for training and customer service.
How videos will develop is being shaped by consumer behavior daily, and we don’t know how people will react to such art or how that will affect their expectations in their organizations. But consumerization means that such reactions will tell us where to go next.
If you’re a Gartner client, you can find more detail in Enterprise Video Must Be Short, Specific and Searchable to Suit Viewers’ Tastes.
by Whit Andrews | January 9, 2014 | 1 Comment
Thanks to NextDraft, a marvelous collection of commented-upon links in the now-venerable “‘zine” format, I learn in a short bit (Me, Myselfie and I) that people who have narcissistic tendencies tweet more and Facebook more.
As a person who is just starting to blog and tweet again (and who never — literally never — closes the Facebook tab in his browser), this brings me some pause. The study is extensively quoted in the Pacific Standard (with an ad offering me “free Twitter followers,” clearly unnecessary as I have 62 tracking my new work-only account WhitAndrews_), but I am not buying the full version of the document, at $20. I therefore can’t see if the study authors did regression testing on their sample, but if they didn’t, it appears to be sufficiently skewed and self-selected to make it less than credible. (If they did, mea culpa.)
For this throwback Thursday, I decided to see what I had said about Twitter when I was blogging before. At least once, I captured much of my sense of the service, in “No Twit,” when I said I just didn’t understand it. What I wrote then about not understanding a CEO who chattered about his personal life still goes, and I still do get it when people use it professionally to provide solid advice and a sort of ongoing stream of professional consciousness. I am interested that my friend Dan Tunkelang still mostly tweets professionally, although the occasional personal detail does drop in.
But the mass of use is so much greater now that the value is significantly higher for me, too. I have a personal account (@whita) that I rarely post to and where I focus on my outdoor activities and pictures, and the work account I noted above. I follow smart thinkers about startups and early stage businesses, which is an area I’m trying to learn, so Twitter is ideal.
Back then, I also noted that Modista was a lovely way to understand the opportunities and challenges of post-literate, post-textual searching. It vanished in some lawsuits, but in a truly delightful throwback surprise, it’s back! More to come on contemporary search models, which promise to be less textual by the day (or the startup). This week brought us Jelly, and I look forward to discovering if users understand how to take advantage of its elegant, simple attempt to reimagine searching with a purpose.
I’m glad that my wording back in 2009 was judicious. I didn’t say Twitter would vanish, although I may have thought it then. (Can’t remember.) I honestly said I didn’t understand it, and I honestly can say the same now, but I’m getting there. I learn best through reading or doing, so I’m not always such a throwback, after all.
by Whit Andrews | January 5, 2014 | Comments Off
Gartner used the word consumerization for the first time ever in 1997, long before any of us could have the smartphones that are at the heart of this blog entry. (Wait for it.)
We meant something a little different then from what we mean now: At the time, we were writing about how consumers were driving the market for printers through their hunger for more sophistication. Now, we use the term to indicate that “consumer acceptance will…drive enterprise adoption,” to quote the otherwise now far less topical research note “Microsoft’s Indigo Will Advance Web Services,” in which we first used the term as we now mean it. (We published that note in February 2005.)
Today I read a story that made clear how central this has become to our careers in IT. The New York Times story lays out the benefits of the “DJI Phantom 2 Vision Quadcopter,” and of course I want one, pretty much right now, or maybe tomorrow. The drone can hover and take pictures and videos. I cover video content management — the “enterprise YouTube,” speaking of consumerization — and you’d be amazed at how many power companies call, hoping to be able to post video of safe repair procedures. I can see this being a very useful way to capture that kind of video. Another organization, which maintains pipes and waterways, wants to post video of its submerged drones doing diagnostics and repairs. And so on.
You can already think of ways to use this thing, I know, right? Maybe safety and evacuation videos for your physical plants? Investigation of physical property? My law enforcement clients are going to want one for high-speed chases and for investigations of brush fires.
You’ll get it because of the power of the consumer market, of course. If you can afford one — and if you can afford a very good SLR camera, say, then you can afford this (instead). (It’s about $1,200 on Amazon as of now.)
I talked to a client today that was sorting out the value of a product that has, at its heart, a camera and audio recording device. My answer was that consumerization will tear the heart out of that product, as surely as a drone will be featured some time in the next five years in a divorce case. (Hasn’t happened yet. But I’m not the only person to think of the issue. I’m guessing it will be on Law and Order within 10 years.) The Phantom 2 drones use smartphones as control devices; I’m sure that other drones will use them as capture and transmission devices in the near term. (And I’m sure someone handier than me has made it already, anyway.)
This is a pace layer issue, in which any element of a project or device that can be swapped faster will be, and the more likely that the element can be satisfied by a consumer product, the faster that “layer” will renew itself. Today, drone control is renewed very quickly, because it’s a smartphone doing the work. In the future, we’ll see devices like 3D printers used as the output core of larger devices that are used in manufacturing, so you can swap out that layer. Anything that has a consumer market will do the same thing.
Swappable microchips have long served as the brains of larger systems; now we’ll see toys and phones and printers and other things you can buy in the electronics department be their extremities and senses as well.
POSTSCRIPT: Interesting post on fun stuff at CES, including such drones.
by Whit Andrews | January 3, 2014 | 3 Comments
The Facebook conveyor belt is not working all that well these days for me either.
(picture of a museum-resident conveyor belt by Allie_Caulfield, who tagged it as usable for such purposes)
I joined Facebook about 620 friends ago — that’s six years, in your likely measurement. After careful feed management, the creation of some custom friends lists (“Real Friends” and “People Who Like New Profanity” are examples), and the occasional purge, I am now confident in saying I have absolutely no freaking clue why I see what I see.
I could read about it; this is true. I remember when I thought of “social media optimization” and Googled the term to discover that there were, like, three mentions of it on the Internet. (2007?) But I don’t want to read about it; I just want to sit down, look at Facebook, and see it working. I don’t want to learn how the algorithm works, and more important, I really don’t want to have to find another social media network, the way most younger folks seem to. (Yes, I’m on Instagram, but it’s very post-literate, and yes, I’m on Line, but it’s anti-literate, and yes, I know that’s me being old, but guess what? I kinda am.)
My friend Jim Tobin just wrote a fine, crisp blog entry on the issue of the Facebook feed’s freshness. He couldn’t be more right about the feed’s confusing present, including in particular its problem with seeming stale. (Why do these same posts keep coming back? Yes, I know why. It’s a rhetorical question that means make it stop.) I’m also quite clear on how much I pay Facebook, which is nothing.
What stops this from being a rant (well, makes it less of a rant) is the simple fact that this is a major business stress. Facebook is at a point where it needs to find growth in what it sells, not in getting new customers. It has to scale, and scaling is the scariest thing I run into for companies of any size. The feed is standard, and Facebook knows that users like me — who create lists, tweak settings, learn the process — are quite rare. Most of us just accept the default, and responding to what we like is generally excellent business. Follow the data, like Google, and perfection will follow.
Maybe not. Artists of data analysis are difficult to find, and this is a hard place Facebook finds itself in. Twitter has no algorithm, and neither does Instagram, which makes them easier to parse, but they’re different businesses — not virtual kitchen tables, but virtual streetcorners. Facebook needs charisma, and the certainty of its mission. I hope it rights its algorithmic slide. Until then, I suspect I need to make better lists to defeat it.