Joining with the Tech

This site is designed to provide you with links to up-to-date news in the field of Computer Science.

Sunday, March 21, 2010

New Teaching Tools Aid Visually Impaired Students in Learning Math

Mastering mathematics can be daunting for many children, but researchers have found that children with visual impairments face disproportionate challenges in learning math. By the time they reach college, they are significantly under-represented in science, technology, engineering and mathematics disciplines.

Tuesday, March 9, 2010

2009 ACM A.M. Turing Award Winner Named

ACM has named Charles P. Thacker the recipient of the 2009 ACM A.M. Turing Award for his pioneering design and realization of the Alto, the first modern personal computer, and the prototype for networked personal computers. Alto incorporated bitmap (TV-like) displays which enable modern graphical user interfaces (GUIs), including What You See Is What You Get (WYSIWYG) editors. Thacker's design, which he built while at Xerox PARC (Palo Alto Research Center), reflected a new vision of a self-sufficient, networked computer on every desk, equipped with innovations that are standard in today's models.

Dr. Thacker is also recognized for his contributions to the Ethernet local area network, the "interconnection fabric" that allows multiple digital devices such as workstations, printers, scanners, file servers, and modems to communicate with each other. Today's Ethernets, which are thousands of times faster than the original version, have become the dominant local area networking technology. He also designed the first multiprocessor workstation, and the prototype for today's most used tablet PC, with its capabilities for direct user interaction.

The ACM A.M. Turing Award is ACM's most prestigious technical award. It recognizes contributions of lasting and major technical importance, and honors individuals whose work has advanced the field of computing. First presented in 1966, and named for British mathematician Alan M. Turing, the Turing Award is widely considered to be the "Nobel Prize in Computing."

Wednesday, March 3, 2010

Berlin Brain-Computer Interface

The Berlin Brain-Computer Interface aims to improve the detection and decoding of brain signals acquired by electroencephalogram (EEG). To this end, the research focuses on new sensor technology, improved understanding of the brain, and the analysis of brain waves using modern machine learning methods.

Thursday, February 18, 2010

The Irish Times: Reeling in the hackers

A new study reveals that the popular film portrayal of computer hackers is actually quite accurate, writes KARLIN LILLINGTON

IF YOU don’t like the idea of a scholarly paper on the trail of hackers in films, then take it up with Damian Gordon’s parents. “I have to blame my parents – the only films we were ever taken to were science fiction and futuristic kinds of films,” says Gordon, a lecturer in computer science at the Dublin Institute of Technology.

Gordon has just published his paper, Forty Years of Movie Hacking: Considering the Potential Implications of the Popular Media Representation of Computer Hackers from 1968 to 2008, in the current issue of the International Journal of Internet Technology and Secured Transactions.

A self-confessed film buff, he likes to show students clips from such films as a teaching tool because he feels they bring an abstract subject to life and help initiate lively discussions.

“With computer science you’re always trying to explain complex ideas in a clear way. Clips from films can be very useful for that. Any time I can, I try to slip in a film clip.”

In trying to teach his students about security issues, he realised many had misguided notions about what the typical computer hacker is like and where security threats come from.

That set him thinking that perhaps the misperceptions came from the portrayal of hackers in popular culture.

So Gordon set out to compile a list of as many films featuring hacking as he could, and came up with 50 – which he realises is not comprehensive and excludes foreign films, but does pick up most Hollywood films since the late 1960s that fit the criteria outlined in the 29-page paper. He excluded animated films and documentaries, for example.

He included films from as early as 1968 through to 2008, across several genres from science fiction to crime films.

His paper observes a curious dearth of films in the 1970s, just as computing was coming into popular visibility. His theory is that a lifting of censorship rules caused films to focus more on violence and sex.

“Hacking computers was probably too passive and boring,” he laughs.

The aim of his paper “was really to investigate why there is a general public perception that hackers all seem to be teenagers in bedrooms. Lots of books on hacking talk about this, but it is so wrong. Most hackers are around 30 and are computer professionals.

“Being a hacker is really not about sitting alone in a dark bedroom. It has a lot more to do with your interpersonal skills.”

His film findings surprised Gordon just as much as they might surprise others. Far from having public perceptions of hackers shaped by films, he found that the celluloid portrayal of hackers was actually quite accurate – setting aside the unlikelihood of your average female hacker looking like Sandra Bullock or Angelina Jolie.

“It’s devastating to realise that most movies do portray hackers correctly,” he jokes.

First off, he found that the majority of film hackers were over 25, with only a quarter younger than that. Some 65 per cent were aged between 25 and 50, and only 3 per cent were older than 50, which he thinks is fairly accurate.

As for profession, 32 per cent were portrayed as working in the computer industry, 28 per cent were full-time hackers, 20 per cent were students and 20 per cent worked in other professions.

Gordon notes that this actually meshes fairly closely with reality – one study cited in his paper notes that the average hacker is 27 and either a computer professional or full-time hacker.

Gordon also found that, in the films, about 10 per cent of the hackers were women, which also approximates real-world statistics.

He notes that for some reason there are far more female hackers portrayed on television compared to film. “I’m presuming that’s because men tend to do the action bits on television,” he says.

Two areas in which film deviated from real-world hacking are the number of attacks depicted as coming from outside an organisation rather than being instigated from those inside an organisation, and the portrayal of the intentions of hackers.

In film, only 20 per cent of the attacks are internal, but industry studies suggest the ratio may be closer to 50-50, Gordon notes in his paper.

Also, the vast majority of hackers in films are actually portrayed as the good guys – a huge 73 per cent, with 10 per cent being somewhere in between, and 17 per cent portrayed as bad guys. “I was definitely surprised at the number of films showing hackers in a positive light,” he says.

However, he rather likes this himself, given that the term “hacker” started out as a positive one, referring to people who were highly adept at tinkering with electronics and writing or modifying computer programs. Only much later did the public start to use the term hacker to mean someone with malicious intent.

“I’d like to reclaim the title as a positive one,” says Gordon.

Damian's top five

Hot Millions (1968) Peter Ustinov as Marcus Pendleton, a con-man just out of prison. “Really a great movie.”

Tron (1982) Jeff Bridges as Kevin Flynn, a former employee of fictional computer company ENCOM. “I adored Tron, and you can never go wrong with Jeff Bridges.”

Superman III (1983) Gus Gorman (Richard Pryor) discovers that he has an extraordinary talent for computer programming. “A great salami-slicing attack.”

WarGames (1983) David Lightman (Matthew Broderick) as a high school student who is highly unmotivated at school but is an enthusiastic computer hacker at home. “Fixed in people’s minds the archetype of the young hacker operating from his bedroom.”

Sneakers (1992) College students Martin Brice (Gary Hershberger) and his friend Cosmo (Jo Marr) use a college computer to hack into banking systems to transfer funds. “Fantastic film.”

Friday, February 5, 2010

Making Your Web Colors Visible For All – Five Color Accessibility Tools

If you are designing, or redesigning, your web site, it is time well spent running it through the color accessibility tools below to ensure that your site can be seen correctly by as many people as possible. Most of these tools apply W3C guidelines to perform their various operations. The Web Content Accessibility Guidelines (WCAG) 2.0 (if you can actually read and understand them) require, amongst other things, sufficient contrast between text and background colors. For a person with a color vision deficiency, the colors used on a web site can mean the difference between being able to read text and images or not. Roughly 1 in 12 people have some form of color deficiency.
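The contrast check these tools perform is well defined in WCAG 2.0: each color is converted to a relative luminance, and the ratio of the lighter to the darker luminance (each offset by 0.05) must reach at least 4.5:1 for normal body text at the AA level. A minimal sketch in Python (the function names are mine, not from any particular tool):

```python
def relative_luminance(rgb):
    """Relative luminance of an 8-bit sRGB color, per the WCAG 2.0 definition."""
    def channel(v):
        c = v / 255.0
        # Linearize the gamma-encoded sRGB channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """Contrast ratio between two colors, from 1:1 (identical) to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Mid-grey (#777777) text on white narrowly fails the 4.5:1 AA threshold.
print(contrast_ratio((119, 119, 119), (255, 255, 255)) >= 4.5)  # False
```

Running your site's actual text and background colors through a check like this is essentially what the tools below automate, along with simulating various forms of color blindness.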

Thursday, January 28, 2010

Build An Optimal Scientist, Then Retire

AI researcher Jurgen Schmidhuber says his main scientific ambition 'is to build an optimal scientist, then retire.' The Cognitive Robotics professor has worked on problems including artificial ants and even robots that are taught how to tie shoelaces using reinforcement learning, but he believes algorithms can be written that allow the programming of curiosity itself.

Wednesday, January 27, 2010

Computer science researcher hopes to stall malware threat by tracking human use behaviors

Danfeng Yao, an assistant professor in the computer science department at Virginia Tech's College of Engineering, will use a $530,000 National Science Foundation Faculty Early Career Development (CAREER) grant to develop software that will differentiate human-user computer interaction from that of malware.