Has Apple Become What It Once Railed Against?

The above video is Apple’s (in)famous advert announcing the launch of the first Macintosh computer, which would compete with IBM’s latest offerings. At the time, Steve Jobs, ever the drama queen, claimed that being outsold by IBM would usher in a technological “Dark Ages”, and that Apple, as the tacit technological hero, had to step in and save the day. With the Mac due to launch in January 1984, the advert portrayed the masses blindly going along with other computers while the individuality embodied by Apple saved the day, and with it the world, from that technological Dark Ages.

The irony, though, is that Jobs was always bent on controlling Apple computers and how the user interacts with them. So intent was he on moving towards point-and-click with a mouse that he insisted the original Macintosh keyboard have no arrow cursor keys, forcing users to interact with the mouse even if they didn’t want to. When asked if he wanted to conduct market research into what the public wanted, he responded that he didn’t, because consumers “don’t know what they want until we show them.” Far from being the antidote to the Big Brother of 1984, then, Jobs was shaping his company into exactly that by controlling how the user interacted with the machine. This control went right down to designing the computer’s case so that only Apple engineers could open it, removing any possibility for users to open it up and look inside.

Fast-forward to today and Apple’s portable devices – the iPhone, iPod and iPad – still adhere to this philosophy. In some ways it’s beneficial: by controlling the hardware and software, Apple is able to produce a seamless user experience. The downside, however, is that the user is severely limited in how they can use the devices, which means that, by Jobs’s own acknowledgement, the portable devices all look and act exactly the same. The focus on simplicity has also meant that many features have been stripped from the products. The iPhone, for example, was extremely limited in its first release, to the point that it lacked 3G connectivity, GPS and Bluetooth file transfer. Although it has progressed a lot since 2007, it hasn’t introduced anything revolutionary since its inception and still lacks many features offered by other devices. Its uniqueness now lies mainly in the number of applications available for it – yet many of those apps merely make up for the missing native functionality.

Apple workers were quick to talk of Steve Jobs’s “reality distortion field”, which essentially means his ability to twist and distort facts to suit his ideas and persuade others to believe him. It is something the public witnessed in his keynotes, and it allowed him to sell products that lacked many features as “revolutionary” and “amazing” – buzzwords he applied to all his products to convince the public that that’s exactly what they were.

While the “1984” advert has gone down in history as one of the greatest advertisements ever made, one has to wonder whether Apple should have been the company to release it. After all, what’s more Big Brother than restricting something to such a degree that users are forced to adapt how they use it? While cursor keys have made it back to Apple keyboards, the ethos is still present. In iTunes, purchased items can only be played on Apple’s software – forget trying to put the film you bought from iTunes onto any device other than an Apple one. And because the iPad lacks the industry-standard micro-USB port, connecting a digital camera to transfer your photos means spending more money on Apple’s own cables.

But to what degree does this “walled garden” approach genuinely benefit the user? Certainly, by managing its own hardware and software Apple can ensure the experience is seamless. But it is likely a stretch to suggest that a micro-USB port cannot be included without ruining the user experience, or that giving a degree of customisation to the iPhone and iPad would do the same. It seems instead that the ideology was born of Jobs’s desire to control – something he never tried to hide, and Walter Isaacson’s biography of the man explains in great detail the lengths to which Jobs would go to ensure control of his products.

With Jobs’s untimely death and a new CEO at the helm of Apple, it will be interesting to see whether the iDevices become any more flexible.

Nokia Vibrating Tattoo – Never Miss Another Call

Nokia has filed a patent for a ferromagnetic material that can be applied to the skin – sprayed, stamped or tattooed in the traditional way. When the phone receives a notification – an SMS, email, phone call and so on – the tattoo will vibrate to alert you to it.

The premise is explained in more detail at the Nokia Connects website:

 

The ferromagnetic material vibrates in a multitude of ways when a message, phone call, low battery indication or several other alerts are received by the tattoo from a Nokia phone. The magnetic field can cause a multitude of different vibrations!

The magnetic mark can, however, remain invisible, making it more appealing to people who don’t want to visibly mark their body with a tattoo. I know I’m not a tattoo person, so this would appeal to me, but each to their own! Wait, there’s more…

The tattoo also acts as a form of identification. Yes, could this be the end of passports?! Maybe not, but it does mean I would no longer need to type a password into my phone. I AM THE PASSWORD!! If I happened to be walking down a busy street blissfully unaware that my pocket was flapping open and someone saw the opportunity to steal my phone, I would be over the moon that they couldn’t access it without the small marking on my arm!

We will let you know if and when the patent application is successful. Fingers crossed, everyone; until then, don’t leave your pockets flapping open.

George Clooney Arrested at Sudan Embassy During a Protest

Hollywood superstar George Clooney and his father were both arrested today during a protest accusing the president of Sudan, Omar al-Bashir, of blocking food and aid from entering the Nuba Mountains and creating a humanitarian crisis. Also arrested were Democratic US Representative Jim Moran of Virginia, Martin Luther King III and NAACP President Ben Jealous. The arrests were made after the group received three warnings not to cross a police line set up outside the embassy, and followed Clooney’s meeting with President Obama, his testimony in the Senate and his attendance at a state dinner held for Prime Minister David Cameron.

Speaking before the arrest, 50-year-old Clooney explained to The Associated Press that he was impressed with Obama’s engagement on the Sudanese topic and was hoping to bring people’s attention to the crisis currently unfolding in Sudan. He also said that for international leaders to expose corruption, they need to “follow the money” that is reaching the leaders of Sudan. “This is a moment where we have a chance to do something because if we don’t, in the next three to four months, there’s going to be a real humanitarian disaster.”

He also acknowledged the uphill struggle facing all those protesting, saying: “It’s such a silly thought to think you’re actually succeeding in any of this. But if it’s loud enough…at the very least people will know about it, and you can’t say we didn’t know. That’s the first step.”


Nokia: Working on a Revolutionary Phone and a Tablet

According to an article in the Finnish newspaper Kauppalehti Optio, Nokia’s design head Marko Ahtisaari and his team are following up the award-winning design of the N9 with a phone intended to revolutionise the user experience of phones. No clues have been given so far other than that the user will not need to bend down and physically press their finger on the screen, which is intriguing. Time will tell whether this will be a natural experience like the swipe motion employed by the N9 and, to some extent, Windows Phone.

Ahtisaari said in the interview that Apple’s offerings, the iPhone and iPad, resemble a poorly designed home: to go from one room to another you have to go through the front door – in other words, to leave one place and open anything else requires pushing the home button. The N9 adjusted this user experience, requiring only a swipe to leave one app and return to the home screen, but whether the new phone will continue with this theme remains to be seen.

Ahtisaari also states in the article that he is spending about a third of his time working on a Nokia tablet – presumably running Windows Phone, but who knows: we could yet see a MeeGo comeback, or even the astounding bendy “kinetic” concept displayed last year, where interactions are done entirely through tilts and bends. Whatever route Nokia takes, it should be one to watch, with CEO Stephen Elop repeatedly stating that, as the iPad is the only truly successful tablet on the market, Nokia will not enter the fray until it has a product that can do well. For that to happen, the tablet would either need to be completely revolutionary or run Windows 8 and tell the consumer market clearly and explicitly why that is beneficial. After all, if the purpose of a tablet is to be a more portable laptop, a well-placed Windows product could clean up as the only tablet to offer full desktop usability. With the iPad being essentially a glorified iPhone with the same crippled multitasking, and the Android offerings largely painful to use, a product that lets consumers do everything they would on their home PC while on the go could be exactly what a tablet is all about. And if Nokia used the concept design in this post’s image, it would certainly stand out from the crowd.


Will the Hunger Games Premiere Match the Hype?

The Hunger Games premieres in cinemas on the 23rd of this month, and with the huge success of the books the question on everyone’s mind is: will it live up to expectations?

So big are the books that, aside from spending 160 weeks on the New York Times bestseller list, they have sold in excess of 23 million copies in America alone. The Hunger Games is one of those rare series that, like Twilight and Harry Potter, permeates society’s collective consciousness and rests there until Hollywood decides to spin a profit with films.

While the legions of fans the Young Adult trilogy has racked up will consider the film a no-brainer, others have noticed that the storyline seems to have taken influence from various other stories. Stephen King’s The Running Man and The Long Walk spring to mind: the former takes place in a future where the government fully controls what is seen and heard in the media, and the top entertainment is a reality gameshow in which people have to outrun killers while the public place bets; the latter is a national sporting event in which 100 teenage boys embark on a treacherous walk across the author’s vision of a totalitarian USA and are shot if they break certain rules or receive three verbal warnings (walking under 4mph is one violation). Then there’s Battle Royale, a story in which the Japanese government captures a year-9 class and, under the Battle Royale Act, forces them to kill each other. Each of these stories has something that The Hunger Games seems to have borrowed, but the author, Suzanne Collins, insists she got the idea while flicking between real-life war coverage and a reality TV show. However they came to be, these books were released into a saturated genre and became huge sellers, so it’s the film we should be concerned with now.

While there are well-known names in the first film, such as Woody Harrelson, Lenny Kravitz, Stanley Tucci and Donald Sutherland, they play important but not primary characters. The main characters are being portrayed by unknowns, so is this a false move or a touch of genius? The Hunger Games will undoubtedly have no trouble at the box office regardless of who stars in it, and everyone already has expectations and opinions of established actors. By going with yet-to-be-big actors, the cast can focus solely on making the film as good as possible rather than worrying about how each actor’s fans will react, though fans may have concerns that unseasoned actors aren’t up to the task. At least moviegoers can be grateful that Kristen Stewart won’t be in yet another film; instead, The Hunger Games features Jennifer Lawrence as Katniss and Josh Hutcherson as Peeta Mellark. How well they do can be judged in less than two weeks, and it’s a safe bet that if they do a bad job, there are millions of people who will be more than happy to say as much.

If you’ve already seen the trailer and can’t wait for the big release, we have some behind-the-scenes footage of the upcoming Hunger Games film below:

 

What are your expectations of the film? Let us know in the comments below.

Ethicists Propose After-birth Abortion

Opinions on abortion are still divided and the topic causes heated debates from time to time, not least during the Presidential campaigns when hopeful candidates speak of their personal outlooks. Yet if the termination of an unborn child with no consciousness is not divisive enough, two ethicists working with Australian universities claim in the Journal of Medical Ethics that “after-birth abortion” should be permissible from an ethical standpoint.

After-birth abortion, once the name has been peeled back, simply means murder, although the two ethicists in question, Alberto Giubilini and Francesca Minerva, prefer it to “murder” or “infanticide” because it emphasises “that the moral status of the individual killed is comparable with that of a fetus”. Also rejected is the term euthanasia, because the reason for the killing may not be the child’s best interests but those of the parents.

Part of the controversy regarding abortion is deciding at what point termination should be allowed, with current rulings settling at 24 weeks. After-birth abortion would necessitate extremely grueling, confusing and rigorous rules to determine an acceptable case, and Giubilini and Minerva state that it would be acceptable in such instances as putting the well-being or life of the family at risk; they consider Down’s syndrome a good example because “such children might be an unbearable burden on the family and on society as a whole, when the state economically provides for their care.” Ultimately this would mean that any newborn that puts a psychological, social or economic burden on the parents or on society could be subjected to an after-birth abortion. The potential risk, should this ever become law, is setting the stage for eugenics: hypothetically, new criteria could be set for an ‘acceptable’ human being, and anything less than that would be considered a burden on the family or society. This would be less likely if the decision rested solely with the parents, but if societal burdens were brought into the equation then the possibility of state interference could not be ruled out.

According to the authors, after-birth abortion is morally acceptable because newborn babies are not people in the “morally relevant sense” but are instead “potential persons”: to be considered a person, in their view, means being “an individual who is capable of attributing to her own existence some (at least) basic value such that being deprived of this existence represents a loss to her.” However, this viewpoint does not seem to address how a child with perfect mental capacity – one that would understand its existence – but with a physical condition that would burden the family or society fits into the suggestion of after-birth abortion. Essentially, Giubilini and Minerva are asserting that, from an ethical standpoint, newborn children should not be considered actual persons any more than a 23-week-old fetus is, despite the state of consciousness that a born child has. This is highlighted in their defence of after-birth abortion that “merely potential people cannot be harmed by not being brought into existence,” although they make no attempt to define at what age someone becomes an “actual” person.

For many, this idea would seem abhorrent. Yet there is the case of at least one woman that may complicate the issue, because she wishes her son had never been born. Not because she does not love her child, but because his condition will not only kill him in the near future but also causes intense suffering for him and his family while he is alive. In other words, this is the sort of scenario Giubilini and Minerva were likely thinking of in their paper.

Emily Rapp is the woman in question, and her son Ronan, who is nearly two, suffers from the progressive genetic disorder Tay-Sachs disease. Although still alive, Ronan is paralysed and blind as a result of the disease. His mother says that had she been aware during her pregnancy that her son would suffer daily seizures and be paralysed to such a degree that he cannot even swallow, she would have saved him the pain and suffering and opted for an abortion – but his condition went undetected. Emily Rapp stressed that while she would have had an abortion, it “would have been a different kind of loss to mourn and would by no means have been a cavalier or uncomplicated, heartless decision.” She also goes to great lengths to ensure people know her words are not borne out of a lack of love for her son, but rather that her love for him is so great she wants to spare him the pain – to the point that she would live without him: “I’m so grateful that Ronan is my child. I also wish he’d never been born; no person should suffer in this way…with no hope for a cure. Both of these statements are categorically true; neither one is mutually exclusive…I love Ronan, and I believe it would have been an act of love to abort him, knowing that his life would be primarily one of intense suffering, knowing that his neurologically devastated brain made true quality of life…impossible.”

It goes without saying, of course, that wishing in hindsight that you had undergone an abortion and killing a child you can physically hold in your arms are not the same thing. But does a real-life example of a parent who sees the suffering in her child’s life, and a degree of kindness in termination, blur the lines of morality enough to make after-birth abortion an acceptable idea? Or does it open too many possibilities for abuse? Is it simply that people suffer at any age, and we need to accept that that is how life is?