iBooks Textbooks & iTunes U Expand Global Reach

John Gruber was asking about how the Textbooks thing was going and–right on cue–Apple announces a big expansion internationally. Great news for education, especially for those of us with overseas courses.

CUPERTINO, California—January 21, 2014—Apple® today announced iBooks® Textbooks and iTunes U® Course Manager are expanding into new markets across Asia, Latin America, Europe and elsewhere around the world. iBooks Textbooks bring Multi-Touch™ textbooks with dynamic, current and interactive content to teachers and students in 51 countries now including Brazil, Italy and Japan; and iTunes U Course Manager, available in 70 countries now including Russia, Thailand and Malaysia, allows educators to create and distribute courses for their own classrooms, or share them publicly, on the iTunes U app.

Source: http://www.apple.com/pr/library/2014/01/21...

How Mighty Software Giants Fall

It's hard to overstate how dominant QuarkXPress was in the publishing world, and in design education anyone suggesting we teach students to use anything else was on a hiding to nothing (I remember similar battles over desktop video editing). There are lessons to be learned from this tale.

As the big dog of desktop publishing in the '80s and '90s, QuarkXPress was synonymous with professional publishing. In fact, it was publishing. But its swift and steady decline is one of the greatest business failures in modern tech.

Quark's demise is truly the stuff of legend. In fact, the story reads like the fall of any empire: failed battles, growing discontent among the overtaxed masses, hungry and energized foes, hubris, greed, and... uh, CMYK PDFs. What did QuarkXPress do—or fail to do—that saw its complete dominance of desktop publishing wither in less than a decade? In short, it didn’t listen.

Source: http://arstechnica.com/information-technol...

Nintendo Needs To Correct Course

Balanced and thoughtful commentary on the challenge faced by the iconic console giant.

We all knew it was coming, but Nintendo unleashed the bad-news bonanza late last night: It won’t make the 55 billion yen (about $520 million) profit it initially forecasted for this fiscal year, but instead it will lose about 25 billion yen ($240 million) due to weaker than expected sales of pretty much all of its products. It lowered this year’s sales forecast for the Wii U console from 9 million units to 2.8 million.

Source: http://www.wired.com/gamelife/2014/01/nint...

The End Of Film

 "The Wolf Of Wall Street" is Paramount's first digital-only distribution.

Paramount recently notified theater owners that its Will Ferrell comedy "Anchorman 2: The Legend Continues," which opened in December, was the last movie released on 35-mm film, these people said. Previously, only small movies such as documentaries were released solely in digital format.

The decision is likely to encourage other studios to follow suit, accelerating a complete phase-out of film that could come by the end of the year. "It's of huge significance because Paramount is the first studio to make this policy known," said Jan-Christopher Horak, director of the UCLA Film & Television Archive. "For 120 years, film and 35 mm has been the format of choice for theatrical presentations. Now we're seeing the end of that. I'm not shocked that it's happened, but how quickly it has happened."

Source: http://www.latimes.com/entertainment/envel...

Stark Differences In The US & UK Ebook Markets

Fascinating data on the actual prices paid for ebooks, and the best price points from an overall revenue perspective.

From talking to my users, they fall broadly into two categories. First there are the avid readers who buy many books each week; their watchlist is so long that they are happy to buy whichever is cheap today. Then there is the reader who has a particular book in mind; they do not buy very often but when they do, they are not price-sensitive, they just want the book straightaway.

But why the difference between the US and the UK?

Source: http://techcrunch.com/2014/01/15/ten-thing...

The Future Of Education Throughout The Twentieth Century

Some of these are amazing, and not all as ridiculous as they might seem at first glance. At least half a dozen of the entries are almost fully realised in the form of an iPad and a handful of apps.

Every generation has its shiny new technology that's supposed to change education forever. In the 1920s it was radio books. In the 1930s it was television lectures. Here in the second decade of the 21st century, it seems the Massive Open Online Course (MOOC) is the education tech of tomorrow. Let's hope it pans out better than previous attempts.

Today we take a look back at 15 technologies that were supposed to radically change the way that people are educated around the world. Some innovations were mostly hype. Others had an undeniably meaningful impact.

Source: http://paleofuture.gizmodo.com/15-technolo...

Photoshop Gets 3D Printing Features

Because what Photoshop needs is more features.

Adobe also sees Photoshop as an ideal intermediary step between designing 3D models and printing them. The tool can take virtually any 3D model in the standard OBJ, STL, 3DS, Collada and KMZ formats and prepare them for printing. This means adding the necessary scaffolding and rafts so the models can actually be printed, for example. It will also look for potential issues with the model, so users don’t waste a lot of time and material trying to print a model that doesn’t actually work.

Source: http://techcrunch.com/2014/01/15/adobe-par...

Smart Cutlery Aims To Change The Way You Eat

As with most 'smart' devices at the moment, the Hapifork is limited by syncing and charging, as well as our natural ability to ignore good advice from a nagging technological snitch.

After a few meals, I was surprised to find myself being more thoughtful with my eating, placing my fork down between bites. It didn't always stop me from wolfing down dinner after a long day at work, though — I was hungry. When my hunger outweighed my concern for my eating speed, I ignored the Hapifork’s vibrations and its disapproving red light.

I wonder if they'll do the same for chopsticks?

Source: http://www.theverge.com/2014/1/15/5309032/...

Online Education Will Be Better Than The Classroom

Mattan Griffel over at One Month Rails takes on those who underestimate online education:

Imagine a world in which no one person experiences the same class in the same way. One that adjusts a lesson on computer programming depending on whether a student already has previous experience with programming, or is a total beginner – why not use concepts a student may already have to allow them to learn something faster?

Mattan's right about some of this: Computer-mediated learning has huge potential that's currently untapped, and customisation to individual learning styles is only one aspect. He's wrong, however, in making this an either/or argument, and in characterising classroom education as all linear and one-size-fits-all. There's a great deal that an effective educator can do to accommodate different learning styles, and a great many approaches we can employ in group learning that aren't yet happening effectively online. Both in-class and out-of-class learning can use collaboration and technology to improve tremendously over the next decade.

Source: http://blog.onemonthrails.com/the-case-for...

The Reaction To Google's Nest Purchase

I've purposely avoided writing about this over the last twenty-four hours, but Nilay Patel's piece for The Verge strikes me as balanced and sane.

The question of trust is perhaps most important of all. Fadell once described the Nest thermostat to me as nothing more than an on / off switch with a lot of nuance — nuance gained by collecting huge amounts of data about your living patterns and energy needs. Adding that data to Google's formidable collection of information about nearly everyone who uses the internet struck immediate fear into privacy advocates and a growing base of skeptics who contend Google's ad-supported business model creates an anti-privacy culture.

I remember a conversation I had about a year ago with a smart person who's a fan of both Apple's products and Google's services. I explained the reasons for my mistrust of Google, but he was convinced that 'regular people' wouldn't ever care what was happening to their data, as long as they got free stuff. He might still be right, but twelve months on I get the sense that more and more people are feeling uneasy about the privacy implications of Google's business model.

Source: http://www.theverge.com/2014/1/14/5307530/...

Siri, Google Now, & The Smartwatch Tread A Path Towards "Her"

The greatest act of undesigning in Her, technologically speaking, comes with the interface used throughout the film. Theo doesn’t touch his computer–in fact, while he has a desktop display at home and at work, neither have a keyboard. Instead, he talks to it. “We decided we didn’t want to have physical contact,” Barrett says. “We wanted it to be natural. Hence the elimination of software keyboards as we know them.”

Again, voice control had benefits simply on the level of moviemaking. A conversation between Theo and Sam, his artificially intelligent OS, is obviously easier for the audience to follow than anything involving taps, gestures, swipes or screens. But the voice-based UI was also a perfect fit for a film trying to explore what a less intrusive, less demanding variety of technology might look like.

My first thought on using Siri, a little over two years ago, was that Apple didn't intend it to be perfect. In fact, given the technology of the time—and our imagination of what a true artificial agent might be—it was inevitable that the reality would fall far short. So, while not on the same level of perception/reality mismatch as Newton's handwriting recognition, Siri has come in for her own fair share of satirical brickbats since she first made an appearance on the iPhone 4S.

The point though was never to introduce a perfect AI interface: Rather it was to help us rethink how we configure, control and interrogate complex technologies. Put simply, Siri's job was to train us to talk to our devices. Viewed as such, I'd call Siri a partial success, and a work-in-progress. I use Siri from time to time: Often when I have the phone in my pocket and my headphones in ("what's the time?", "read my messages", "tell my wife I'm almost home"), sometimes when my hands are busy ("set a timer for three minutes"), and occasionally when the process is just easier than tapping and swiping ("wake me at 7"). Other times I forget she's lurking below the surface of the UI, one accidental extended press away from waking and interrupting my flow.

I'm convinced though that Siri and her descendants will be a big part of our future. I love what the production team on "Her" have in mind for their "slight future".

Source: http://www.wired.com/design/2014/01/will-i...

The Beginnings Of Pixar

If you don't already know about how Pixar began life as a high-end computer imaging firm, this is as good an introduction as any:

By the time Ed Catmull graduated from the computer science program at University of Utah, he was already considered a genius and pioneer in his field. He had developed texture mapping, a method for adding detail, texture, and color to a computer 3-D model. In 1972, he used texture mapping to create one of the earliest examples of 3-D computer animation ever – an animated film of his left hand. The one minute clip was eventually bought by a Hollywood producer and used in the 1976 film “Futureworld” – the first full-length film to use computer animation. Today, the clip, entitled “A Computer Animated Hand,” is housed in the Library of Congress after being selected for preservation in 2011 by the National Film Registry.

And if you've not seen it before, that clip from Futureworld is here. Amazing stuff.

Source: http://www.todayifoundout.com/index.php/20...

Downsizing The iPad

Jonathan Seff over at Macworld made the journey from the full-size iPad to the new iPad Mini, and loves it.

In short, the switch from big iPad to little after three and a half years has been relatively painless. In most ways, I’ve found the diminutive tablet more portable, easier to hold, and capable of handling what I want from an iPad.

Where Jonathan waited until the retina display came to the smaller form factor, I made the switch to the Mini a year earlier, when it first shipped, and I've now switched full time to the iPad Air. In many ways I agree with Jonathan, and his reasoning is impeccable: Same (or near enough) great display, same battery life, same processor, more portable. The trade-offs were much greater when I opted for the first-generation iPad Mini in 2012, but the portability still shone when compared to the weight and bulk of the iPad 3/4, and it was the Air shedding that bulk that made my switch back to full-size worthwhile.

Nevertheless, given a straight choice between both new iPads I'm still coming down on the side of the Air: It's not significantly heavier than the Mini (at least it feels about the same in the bag I carry), it's a lot easier to hold all day than the old full-size iPads, the screen is a little bit nicer, and typing/reading/navigating UI is a whole bunch easier on the larger screen. That final one is, for me, the clincher—I'm faster and more productive in almost everything I use it for.

What's great of course is that anyone, like Jonathan, who finds the Mini to be more useful doesn't have to make any compromise in terms of performance. That's not a place I expected the iPad line to be by early 2014—let alone late 2013—and it demonstrates just how much of a leap Apple's made in terms of its own silicon over the last year. It's a great time to be in the market for an iPad of whichever size suits you best.

Source: http://www.macworld.com/article/2086624/le...

Being Frank

Jon Ronson captures the twisted genius of Frank Sidebottom, the greatest papier-mâché pop star ever.

Frank Sidebottom was possibly the strangest pop star in history. Jon Ronson, who played in his band, and has co-written a film inspired by the character starring Michael Fassbender, remembers Frank's creator Chris Sievey as being even more eccentric than his papier-mâché alter ego.

I met Frank—and Chris—a couple of times, and found him—them—charming and fascinating. We won't see the likes of Frank again anytime soon. Very much looking forward to seeing "Frank".

Source: http://www.theguardian.com/culture/2014/ja...

How The News Strips Created Mass Media

Another great book to add to the wish list. The technology of colour newsprint enabled an explosion of creativity every bit as disruptive as those in contemporary times.

Although replete with racial and ethnic stereotypes, the first newspaper comic strips were not so much an extension of vaudeville as precursors of the equally déclassé and temperamentally anti-authoritarian motion picture. The early strips thrived on choreographed violence, including runaway horse carts, baroque streetcar collisions, and a panoply of what Hearst might have termed polychromous explosions.

Source: http://www.nybooks.com/blogs/nyrblog/2013/...

Kalashnikov's Spiritual Pain

Further to this morning's Kalashnikov rifle post.

"I keep having the same unsolved question: if my rifle claimed people's lives, then can it be that I... a Christian and an Orthodox believer, was to blame for their deaths?" he asked.

"The longer I live," he continued, "the more this question drills itself into my brain and the more I wonder why the Lord allowed man to have the devilish desires of envy, greed and aggression".

Source: http://www.bbc.co.uk/news/world-middle-eas...

The Pilcrow, The Interrobang, & The Amphora

I think I've always been fascinated with punctuation and type, but the older I get, the more I love it. 

If, as Kurt Vonnegut believed, the only reason to use a semicolon is to show that you’ve been to college, what does it say when someone uses a pilcrow? Or, for that matter, an interrobang, a manicule, or an octothorpe? While this book doesn’t make any judgments about the punctuation one chooses to use or avoid, Shady Characters takes an entertaining look at the evolution of both common and lesser-known characters.

Shady Characters looks like a great read, and I've added it to my wish list. (Update: It's already available for Kindle, and now purchased.)

Source: http://www.weeklystandard.com/articles/per...

Good Design That Kills

Not all design is a force for good.

Mikhail Kalashnikov didn’t get rich from the huge success of his rifle, since he didn’t hold a patent. (That sort of thing was simply not done by citizens who labored for Soviet socialism.) He has been quoted saying that he would have preferred to create a good lawnmower, and blamed the Nazis for making him invent a weapon.

Source: http://observatory.designobserver.com/feat...

Smart TVs Are Still Dumb

JLG bemoans the lack of intelligence in even so-called Smart TVs, and wonders how we're ever going to get to an "Internet Of Things" for normal people.

If an appliance would yield its control and reporting data, an app developer could build a “control center” that would summarize and manage your networked devices. But in the Consumer IoT world, we’re still very far from this desirable state of affairs. A TV can’t even tell a smartphone app if it’s on, what channel it’s tuned to, or which device is feeding it content. For programmable remotes, it’s easy to get lost as too many TVs don’t even know a command such as Input 2, they only know Next Input. If a human changes the input by walking to the device and pushing a button, the remote is lost. (To say nothing of TVs that don’t have separate On and Off commands, only an On/Off toggle, with the danger of getting out of sync – and no way for the TV to talk back and describe its state…)
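The toggle-versus-discrete-commands problem in the excerpt above is easy to model. Below is a minimal sketch—purely illustrative, not any real remote's protocol, with hypothetical class and method names—of why a remote that only knows an On/Off toggle and gets no feedback from the TV inevitably drifts out of sync:

```python
# Hypothetical model of the sync problem JLG describes: a toggle-only
# remote must maintain its own guess of the TV's power state, because
# the TV offers no feedback channel. Discrete On/Off commands would be
# idempotent and immune to this drift.

class TV:
    def __init__(self):
        self.powered = False

    def toggle(self):  # the only command the remote can send
        self.powered = not self.powered

    def press_front_panel_button(self):  # a human walks up to the set
        self.powered = not self.powered


class ToggleOnlyRemote:
    """A remote with no feedback channel: it can only guess the TV's state."""
    def __init__(self, tv):
        self.tv = tv
        self.assumed_powered = False  # the remote's private guess

    def set_power(self, want_on):
        # Without a way to query the TV, the remote toggles only when
        # its *assumption* disagrees with the requested state.
        if self.assumed_powered != want_on:
            self.tv.toggle()
        self.assumed_powered = want_on


tv = TV()
remote = ToggleOnlyRemote(tv)

remote.set_power(True)          # works: TV turns on, guess is correct
tv.press_front_panel_button()   # a human turns it off; remote never learns
remote.set_power(True)          # remote believes it's already on: no-op
assert tv.powered is False      # the "turn on" request left the TV off
```

With discrete On and Off commands (or any way for the TV to report its state), `set_power` could simply send the desired state every time and the human intervention would be harmless.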

This is why, to all intents and purposes, I've given up on the programmable Bluetooth-IR blaster I bought a year or so ago (though terrible battery life was also a factor). It's also a reason why so many people still hope Apple will ship a complete TV system, with all the vertical integration that it can bring to the markets it enters. For the record, I don't think we'll get an Apple display that functions as a TV anytime soon, and I'd much sooner have something like the existing Apple TV box that I can upgrade every three years or so (and via software in the interim). I wonder whether one route might be a higher end option that functions as an AV Receiver/HDMI switch/Airplay bridge, and routes other inputs to the TV screen (kind of like the Xbox One does), though that doesn't seem a very Apple-like move to me. Right now I'd be happy with an App platform for the existing Apple TV: I'm watching less and less stuff that doesn't come via that little £99 black box as time goes by.

Source: http://www.mondaynote.com/2014/01/12/inter...