Posts Tagged ‘Watson’

Watson Goes To Medical School

October 31st, 2012 10:20 admin


First-time accepted submitter Kwyj1b0 writes “IBM’s Watson is headed to the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University for training. Clinicians and students will answer and correct Watson’s questions, in an attempt to crowdsource its education. From the article: ‘“Hopefully, we can contribute to the training of this technology,” said Dr. James K. Stoller, chairman of the Education Institute at Cleveland Clinic. The goal, he added, was for Watson to become a “very smart assistant.” Part of Watson’s training will be to feed it test questions from the United States Medical Licensing Exam, which every human student must pass to become a practicing physician. The benefit for Watson should be to have a difficult but measurable set of questions on which to measure the progress of its machine-learning technology.’”

Source: Watson Goes To Medical School

Will IBM’s Watson Kill Your Career?

June 8th, 2012 06:57 admin


Nerval’s Lobster writes “IBM’s Watson made major headlines last year when it trounced its human rivals on Jeopardy. But Watson isn’t just sitting around spinning trivia questions to stump the champs: IBM is working hard on taking it into a series of vertical markets such as healthcare, contact management and financial services to see if the system can be used for diagnosing diseases and catching market trends. Does this spell the end for certain careers? Not really, but it does raise some interesting thoughts and issues.”

Source: Will IBM’s Watson Kill Your Career?

Computer Programming for All: A New Standard of Literacy

May 17th, 2012 05:30 admin

Everyone ought to be able to read and write; few people within the global mainstream would argue with that statement. But should everyone be able to program computers? The question is becoming critically important as digital technology plays an ever more central role in daily life. The movement to make code literacy a basic tenet of education is gaining momentum, and its success or failure will have a huge impact on our society.

The democratization of literacy in the late 19th century created one of the great inflection points in human history. Knowledge was no longer confined to an elite class, and influence began to spread throughout all levels of society. Any educated person could command the power of words.

What if any educated person had equal sway over the power of machines? What if we were to expand our notion of literacy to encompass not only human languages but also machine languages? Could widespread facility in reading and writing code come to be as critical to society as the ability to manipulate spoken and written language?

The usual definition of computer literacy stops at the UI: if a user knows how to make the machine work, he or she is computer-literate. But, of course, the deeper literacy of the programmer is far more powerful. Fortunately, computer languages and human languages are fundamentally similar. Like human languages, computer languages vary in form and character (Python to Java to Ruby) and can be used in infinite ways. My Python may not look like your Python, but it can do the same thing; likewise, a single idea can be expressed using many different combinations of English words. And both kinds of language are infinitely flexible. Just as a person literate in English can compose everything from a sonnet to a statute, a person literate in programming languages can automate repetitive tasks, saving time for things only a human can do; distribute access to systems of communication and control to large groups of people; and train machines to do things they’ve never done before.

Computer programming already does marvelous things: it delivers this article to your mind, operates life-sustaining medical devices and enables IBM’s Watson to win at Jeopardy. The current potential for innovation would be many times greater if every schoolchild had a firm grasp of programming concepts and how to apply them.
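The point that one idea admits many phrasings is easy to demonstrate in Python itself. A minimal sketch: two stylistically different programs that, like two English sentences sharing one meaning, produce exactly the same result.

```python
# Two ways to say the same thing: collect the squares of 0 through 9.

# Version A: a list comprehension, compact and declarative.
squares_a = [n * n for n in range(10)]

# Version B: an explicit loop, step by step.
squares_b = []
for n in range(10):
    squares_b.append(n * n)

print(squares_a == squares_b)  # True -- different phrasings, identical meaning
```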

Among programmers, a movement is forming around this idea. Shereef Bishay, founder of San Francisco-based Dev Bootcamp, believes that coding is destined to become a new form of widespread literacy within the next 20 to 30 years. Everybody should learn to code, he says, because machine/human and machine/machine interaction is becoming as ubiquitous as human/human interaction. Those who don’t know how to code will soon be in the same position as those who couldn’t read or write 200 years ago.

Three hundred years ago, Bishay said, “you would have to hire someone to write a letter for you, and hire them to read the letter for you. It is just insane.” Today, most people must likewise hire a skilled programmer to write computer programs for them.

The code literacy movement began to gather steam in late 2011, when Codecademy started teaching basic programming skills for free. The debate came to a head this week as two blog posts took the top spots on the tech website Hacker News. The first, dubbed “Please Don’t Learn to Code,” came from noted developer and Stack Overflow co-founder Jeff Atwood on his blog, Coding Horror. The second, a rebuttal entitled “Please Learn to Code,” came from Sacha Greif, a Parisian designer whose clients include HipMunk and MileWise.

“I do think (or at least, hope) that computer programming will become the next version of literacy,” Greif wrote in an email to ReadWriteWeb. “When I watch my 4-year-old niece interact with an iPhone, I see her intuitively using interaction patterns that older people often have trouble with, even when they’re computer-literate. And kids can easily memorize huge quantities of facts about complex abstract systems like Pokemon games. So clearly they have the potential to learn how to code.”

Not everyone in the programming community agrees. Atwood argues that verbal literacy is a different kind of skill, and more fundamental. “Literacy is the new literacy,” he told ReadWriteWeb. “As much as I love code, if my fellow programmers could communicate with other human beings one-tenth as well as they communicate with their interpreters and compilers, they’d have vastly more successful careers.”

Atwood stresses learning, and mastering, the basic skills of communication. Learn to read. Learn to write. Learn to hold a conversation. Learn some basic math. These skills, he says, are more essential than being able to program a computer.

Of course, the path to universal code literacy is not without roadblocks. The skills required will depend on how computing evolves over the next several decades: how, for instance, will quantum computing affect our relationship with computers? The human capacity to learn, however, is not at issue. If it becomes necessary for me to code to interact with my machine, I will likely learn to code. It is no different than if I were dropped off in Cambodia without a place to stay or food to eat – I’d learn the local language posthaste.

At present, the ability to program computers is vocational, like carpentry or learning to cook. There’s little impetus to make it universal. But imagine if it were.

Should computer programming become the new literacy? Or should it remain a vocation? Let us know in the comments. 

Images courtesy of Shutterstock


Source: Computer Programming for All: A New Standard of Literacy

Double-Helix Model of DNA Paper Published 59 Years Ago

April 2nd, 2012 04:40 admin


pigrabbitbear writes with musings on the anniversary of the groundbreaking paper on DNA structure by Watson and Crick. From the article: “Consider every organism that’s ever lived on Earth. From dinosaurs to bacteria, the number is near infinite, and an overwhelming majority have their entire structures and lives dictated according to their DNA. The DNA molecule is life itself, and it’s astonishing that we’ve only known what it looks like for less than a century. But it’s true: In one of the most groundbreaking papers ever published, James D. Watson and Francis Crick described the double-helix structure of DNA in Nature, 59 years ago today.”

Source: Double-Helix Model of DNA Paper Published 59 Years Ago

IBM Watson To Battle Patent Trolls

December 8th, 2011 12:07 admin


MrSeb writes “IBM’s Watson is made of many parts: speech recognition, natural language processing, machine learning, and data mining. All of these were perfectly combined to beat Ken Jennings at Jeopardy!, and now each of these components is slowly finding its way into other applications. Health plan company WellPoint, for example, is using Watson to investigate patient records to improve diagnosis, and in a self-referential, possibly universe-destroying twist, IBM itself is using Watson to help sell Watson (and other IBM products) to other companies. Now, using Watson’s data mining and natural language talents, IBM has created the Strategic IP Insight Platform, or SIIP, a tool that has already scanned millions of medical patents and journals for the sake of improving drug discovery — and in the future, it’s easy to see how the same tool could be used to battle patent trolling, too.”

Source: IBM Watson To Battle Patent Trolls

Amazon EC2 Now #42 Supercomputer, IBM BlueGenes in the Dust

November 15th, 2011 11:11 admin

The question among cloud computing consumers and supercomputer clients alike has been when the distinction between the little cloud and the big iron would disappear. Apparently that boundary evaporated several months ago. In its twice-annual survey of big computer power, the University of Mannheim has reported that Amazon’s EC2 Compute Cluster – the same one you and I can rent space and time on today – performs well enough to be ranked #42 among the world’s Top 500 supercomputers.

How far down is #42? In terms of time, not far at all. When EC2 was but a gleam in Jeff Bezos’ eye, Lawrence Livermore National Laboratory’s BlueGene/L was king. Now, the 212,992-core beast ranks #22. Roadrunner, the amazing hybrid mixing IBM PowerXCell and AMD Opteron cores – 122,400 of them in all – sits at #11. Meanwhile, EC2, whose makeup is a little of this and a little of that, has achieved #42 status with only 17,024 cores.

Twice each year, the rankings of the world’s 500 fastest supercomputers are assessed by the University of Mannheim in association with Lawrence Berkeley National Laboratory and the University of Tennessee, Knoxville. Those assessments use the industry-standard Linpack benchmark, which measures floating-point performance. Systems are scored by their maximal observed performance, in gigaflops (GFlops, or billions of floating-point operations per second). This score is called the “Rmax” rating, and computers are ranked on the Top 500 list according to it. For comparison, the list also publishes each system’s theoretical peak performance (“Rpeak”), representing how fast its architects believe it could or should perform. Dividing Rmax by Rpeak produces a yield figure, which represents how well each system performs relative to its engineers’ expectations.
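The yield arithmetic is simple enough to sketch in a few lines of Python. Note that the Rpeak value below is not taken from the Top 500 list itself; it is back-derived, purely for illustration, from the EC2 figures quoted in this article (an Rmax of 240,090 GFlops at a 67.8% yield).

```python
def yield_pct(rmax_gflops: float, rpeak_gflops: float) -> float:
    """Rmax divided by Rpeak, as a percentage: how close a system comes
    to the theoretical peak its architects designed for."""
    return 100.0 * rmax_gflops / rpeak_gflops

ec2_rmax = 240_090            # GFlops, from the article
ec2_rpeak = ec2_rmax / 0.678  # derived from the reported 67.8% yield, not from the list

print(round(yield_pct(ec2_rmax, ec2_rpeak), 1))  # 67.8
```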

EC2’s yield is not particularly great: just 67.8%. By comparison, the winner and still champion of the November 2011 Top 500 list is a machine simply called “K,” assembled for the RIKEN Advanced Institute for Computational Science in Kobe, Japan. Its yield is an astonishing 93.17%. Cloud architectures are not known for their processor efficiency; oftentimes they’re “Frankenstein” machines cobbled together from available parts, but marshaled by a strong, nimble, and adaptive cloud OS.

The Linpack Rmax score for EC2 topped out at 240,090 GFlops – almost a quarter of a petaflop. LANL’s Roadrunner was declared to have broken the one-petaflop barrier (one thousand trillion floating-point operations per second) in June 2008. Japan’s “K” has now shattered the 10-petaflop barrier with an Rmax score of 10,510,000 GFlops.
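Those unit conversions check out: Top 500 scores are reported in GFlops, and one petaflop is 1,000,000 GFlops.

```python
GFLOPS_PER_PFLOP = 1_000_000  # 10^6 GFlops = 10^15 flops = 1 PFlop

print(240_090 / GFLOPS_PER_PFLOP)     # 0.24009 -- EC2, "almost a quarter of a petaflop"
print(10_510_000 / GFLOPS_PER_PFLOP)  # 10.51   -- K, past the 10-petaflop barrier
```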

At this rate, EC2 actually may not catch up with the top 20; the rate at which the world’s supercomputers are improving in both speed and efficiency outpaces cloud clusters. What’s interesting about the November 2011 rankings is how processors made for supercomputers are outpacing clusters built from commercial off-the-shelf (COTS) processors like Intel Xeon, AMD Opteron, and IBM Power. “K,” for example, is made up of 88,128 eight-core Fujitsu SPARC64 VIIIfx chips – 705,024 cores in all – with an arithmetic unit separate from the instruction control unit in each core. The chip is a high-performance derivative of the SPARC64 line Fujitsu builds for its own servers.

Faster optical interconnects between the chips, as opposed to commercial sockets, also account for huge speed gains. The first leap forward in interconnect technology took place a quarter-century ago, when computer designers began using four-dimensional mapping to link nodes to one another. The theoretical shape formed was a “tesseract,” and the theoretical number of feasible connections was a “googolplex” – a term popularized by the late Dr. Carl Sagan. That’s the door through which the word “googol,” later famously misspelled as “Google,” entered our common vernacular.

Fujitsu’s architecture for “K” is based on a theoretical six-dimensional torus, which reduces the hop count for processes between nodes by half or more, and which enables as many as 12 fault-tolerance failovers per node. That’s a feature cloud architects might want to take into account; the failover features that cloud operating systems like OpenStack retrofit into conventional clusters by force, supercomputers implement by design.
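The hop-count claim can be illustrated with a toy calculation. The sketch below is not a model of K’s actual interconnect; it only shows the general rule that, for a fixed number of nodes, folding them into more torus dimensions shrinks the worst-case hop distance.

```python
def max_hops(dims: int, side: int) -> int:
    """Worst-case hop count in a torus with `dims` dimensions and `side`
    nodes per dimension: thanks to the wrap-around links, the farthest
    node along each dimension is side // 2 hops away."""
    return dims * (side // 2)

# The same 4,096 nodes, arranged three ways:
print(max_hops(1, 4096))  # 2048 hops in a flat ring
print(max_hops(3, 16))    # 24 hops in a 3-D torus (16^3 = 4096)
print(max_hops(6, 4))     # 12 hops in a 6-D torus (4^6 = 4096)
```

Each added dimension lets a message route around the structure rather than along it, which is why higher-dimensional tori keep node-to-node latency low at scale.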

Although IBM made headlines last year by demonstrating how Watson could win at “Jeopardy!”, the 16 BlueGene-architecture machines still on the current list are sliding. The Jülich Supercomputing Centre’s JUGENE system, which took the lead among them two years ago, has dropped to #13; the only other BlueGenes in the top 25 now sit at #17, #22, and #23.

Cray, perhaps the most historically significant name in supercomputing and once shut out of the list entirely, is making a supreme comeback. It is now responsible for 27 of the clusters on the latest list, including the #3 Jaguar at Oak Ridge National Laboratory (Rmax score: 1,759,000) plus #6, #8, #11, #12, #19, and #20. Cray’s surge also represents a boon for AMD, since Cray uses Opteron CPUs exclusively.

The United States still maintains 263 of the Top 500, with Jaguar being the fastest. But China is surging forward too with 74 clusters, Japan with 30, and South Korea with 3, its fastest being a Cray at #31.

For those of you at home still keeping score, Windows is almost out of the picture entirely. It powers only the #58 supercomputer, a 30,720-core Opteron cluster operated by the Shanghai Supercomputing Center and built back in 2008. BSD powers the #94 machine, Unix powers 30 systems – most of them on IBM Power processors – and the rest belong to Linux. As for chips, 49 of the Top 500 run on IBM Power, AMD powers 63 clusters, and Intel has the lion’s share with 384. Among the Intel systems, 239 use the latest Core architecture, 141 use earlier x86 designs, and only 4 remain on Itanium.

Source: Amazon EC2 Now #42 Supercomputer, IBM BlueGenes in the Dust

IBM Eyes Brain-Like Computing

October 13th, 2011 10:18 admin


schliz writes “IBM’s research director John E. Kelly III has delivered an academic lecture in Australia, outlining the company’s ‘cognitive computing’ aims and efforts as the 70-year ‘programmable computing’ era comes to a close. He spoke about Watson — the ‘glimpse’ of cognitive computing that beat humans at Jeopardy! in February — and efforts to mimic biological neurons and synapses on ‘multi-state’ chips. Computers that function like brains, he said, are needed to make sense of the exascale data centers of the future.”

Source: IBM Eyes Brain-Like Computing

IBM’s Watson To Help Diagnose, Treat Cancer

September 12th, 2011 09:43 admin


Lucas123 writes “IBM’s Jeopardy-playing supercomputer, Watson, will be turning its data compiling engine toward helping oncologists diagnose and treat cancer. According to IBM, the computer is being assembled in the Richmond, Va. data center of WellPoint, the country’s largest Blue Cross, Blue Shield-based healthcare company. Physicians will be able to input a patient’s symptoms and Watson will use data from a patient’s electronic health record, insurance claims data, and worldwide clinical research to come up with both a diagnosis and treatment based on evidence-based medicine. ‘If you think about the power of [combining] all our information along with all that comparative research and medical knowledge… that’s what really creates this game changing capability for healthcare,’ said Lori Beer, executive vice president of Enterprise Business Services at WellPoint.”

Source: IBM’s Watson To Help Diagnose, Treat Cancer

Microsoft Pursues WebOS Devs, Offers Free Phones

August 22nd, 2011 08:40 admin


CWmike writes “Taking advantage of Hewlett-Packard’s departure from the tablet and smartphone market, Microsoft has offered webOS developers free phones, tools and training to create apps for Windows Phone 7. Brandon Watson, Microsoft’s senior director of Windows Phone 7 development, made the offer on Twitter on Friday, and has been fielding queries ever since. ‘To Any Published WebOS Devs: We’ll give you what you need to be successful on #WindowsPhone, incl. free phones, dev tools, and training, etc.,’ Watson said a day after HP’s announcement. Before Friday was out, Watson said he had received more than 500 emails from interested developers, and later, that the count was closing in on 600.”

Source: Microsoft Pursues WebOS Devs, Offers Free Phones

IBM Watson To Replace Salespeople and Cold-Callers

July 6th, 2011 07:08 admin


An anonymous reader writes “After conquering Jeopardy! and making inroads into the diagnosis of medical maladies, IBM’s next application for Watson is improving sales and customer support. Companies will be able to simply fill Watson (or rather, DeepQA) with domain-specific information about products and services, and sit back as it uses its natural language processing skills to answer the queries of potential customers. The potential benefits are huge. Watson could either augment existing sales and support teams, or replace them entirely. Also, in a beautiful and self-fulfilling twist, the first application of this re-purposed Watson will be internal: IBM will use it to help sell more IBM Watsons to other companies.”
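The fill-it-with-documents, ask-it-questions workflow described here can be caricatured in a few lines. To be clear, this is a bag-of-words toy with invented example data, nowhere near DeepQA’s actual natural-language pipeline; it only shows the shape of the pattern: load domain text, then answer free-form queries against it.

```python
# Hypothetical domain documents a company might load into such a system.
docs = {
    "returns":  "Products may be returned within 30 days with a receipt.",
    "shipping": "Standard shipping takes three to five business days.",
}

def answer(query: str) -> str:
    """Return the document whose words overlap the query the most."""
    query_words = set(query.lower().split())
    best = max(docs, key=lambda topic: len(query_words & set(docs[topic].lower().split())))
    return docs[best]

print(answer("How long does standard shipping take?"))  # the shipping policy
```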

Source: IBM Watson To Replace Salespeople and Cold-Callers