- published: 03 Apr 2017
- views: 17417
Geoffrey Everest Hinton FRS (born 6 December 1947) is a British-born cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. As of 2015 he divides his time between Google and the University of Toronto. He was one of the first researchers to demonstrate the use of the generalized backpropagation algorithm for training multi-layer neural networks and is an important figure in the deep learning movement.
Hinton was educated at King's College, Cambridge, graduating in 1970 with a Bachelor of Arts in experimental psychology. He continued his studies at the University of Edinburgh, where he was awarded a PhD in artificial intelligence in 1977 for research supervised by H. Christopher Longuet-Higgins.
He has worked at the University of Sussex, the University of California, San Diego, Cambridge, Carnegie Mellon University and University College London. He was the founding director of the Gatsby Computational Neuroscience Unit at University College London, and is currently a professor in the computer science department at the University of Toronto. He holds a Canada Research Chair in Machine Learning and is the director of the program on "Neural Computation and Adaptive Perception", which is funded by the Canadian Institute for Advanced Research. Hinton taught a free online course on neural networks on the education platform Coursera in 2012. He joined Google in March 2013 when his company, DNNresearch Inc., was acquired, and he is planning to "divide his time between his university research and his work at Google".
Deep learning (deep structured learning, hierarchical learning or deep machine learning) is a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers, with complex structures or otherwise, composed of multiple non-linear transformations.
Deep learning is part of a broader family of machine learning methods based on learning representations of data. An observation (e.g., an image) can be represented in many ways such as a vector of intensity values per pixel, or in a more abstract way as a set of edges, regions of particular shape, etc. Some representations make it easier to learn tasks (e.g., face recognition or facial expression recognition) from examples. One of the promises of deep learning is replacing handcrafted features with efficient algorithms for unsupervised or semi-supervised feature learning and hierarchical feature extraction.
Research in this area attempts to make better representations and create models to learn these representations from large-scale unlabeled data. Some of the representations are inspired by advances in neuroscience and are loosely based on the interpretation of information processing and communication patterns in a nervous system, such as neural coding, which attempts to define a relationship between various stimuli and the associated neuronal responses in the brain.
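To make the idea of layered representation learning concrete, here is a minimal sketch in NumPy. It is an illustration only, not any particular system described above: all layer sizes, variable names and the random weights are assumptions, and in a real system the weights would be learned from large amounts of data rather than drawn at random. The point is simply that stacked non-linear layers map raw pixel-like intensity values into progressively more abstract, lower-dimensional feature vectors.

# Minimal sketch: stacked non-linear layers producing increasingly
# abstract representations of a raw input vector. Illustrative only;
# weights are random, not learned.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical sizes: 64 "pixel" intensities -> 32 -> 16 -> 8 learned features.
layer_sizes = [64, 32, 16, 8]
weights = [rng.normal(scale=0.1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    # Pass the raw input through every layer, keeping the intermediate
    # representation produced at each depth.
    representations = [x]
    for W, b in zip(weights, biases):
        x = relu(x @ W + b)
        representations.append(x)
    return representations

image = rng.random(64)  # a fake "image" of 64 intensity values
for depth, rep in enumerate(forward(image)):
    print(f"layer {depth}: representation of size {rep.size}")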
The University of Toronto (U of T, UToronto, or Toronto) is a public research university in Toronto, Ontario, Canada, situated on the grounds that surround Queen's Park. It was founded by royal charter in 1827 as King's College, the first institution of higher learning in the colony of Upper Canada. Originally controlled by the Church of England, the university assumed the present name in 1850 upon becoming a secular institution. As a collegiate university, it comprises twelve colleges, which differ in character and history, each retaining substantial autonomy on financial and institutional affairs. It has two satellite campuses located in Scarborough and Mississauga.
Academically, the University of Toronto is noted for influential movements and curricula in literary criticism and communication theory, known collectively as the Toronto School. The university was the birthplace of insulin and stem cell research, and was the site of the first practical electron microscope, the development of multi-touch technology, the identification of Cygnus X-1 as a black hole, and the theory of NP-completeness. By a significant margin, it receives the most annual scientific research funding of any Canadian university. It is one of two members of the Association of American Universities located outside the United States.
Brain & Cognitive Sciences - Fall Colloquium Series, recorded December 4, 2014. Talk given at MIT. Geoffrey Hinton talks about his capsules project.
Graduate Summer School 2012: Deep Learning, Feature Learning. "Part 1: Introduction to Deep Learning & Deep Belief Nets", Geoffrey Hinton, University of Toronto. Institute for Pure and Applied Mathematics, UCLA, July 9, 2012. For more information: https://www.ipam.ucla.edu/programs/summer-schools/graduate-summer-school-deep-learning-feature-learning/?tab=overview
"Can the brain do back-propagation?" - Geoffrey Hinton of Google & University of Toronto Support for the Stanford Colloquium on Computer Systems Seminar Series provided by the Stanford Computer Forum. Speaker Abstract and Bio can be found here: http://ee380.stanford.edu/Abstracts/160427.html Colloquium on Computer Systems Seminar Series (EE380) presents the current research in design, implementation, analysis, and use of computer systems. Topics range from integrated circuits to operating systems and programming languages. It is free and open to the public, with new lectures each week. Learn more: http://bit.ly/WinYX5
From searching on Google to real-time translation, millions of people use deep learning every day, mostly without knowing it. It's a form of artificial intelligence designed to mimic the human brain. Geoffrey Hinton is a professor in the department of computer science at the University of Toronto. His work on deep learning has been snapped up by Google and is now being used to power its search engine. He joins The Agenda to discuss deep learning and the future of artificial intelligence.
When Geoffrey Hinton, a researcher at Google and professor emeritus at the University of Toronto, began his work in deep learning in the 1970s, he was told he would spend his life toiling away in obscurity. Deep learning is a form of artificial intelligence that mimics the human brain. Now, four decades later, his research is revolutionizing AI. He joins The Agenda to discuss his work and what kept him going.
Deep Learning pioneer, Professor Geoff Hinton discusses neural nets for perception and language understanding at the Creative Destruction Lab Machine Learning and Market for Intelligence conference (Toronto, December 2015).
Stochastic gradient descent in multilayer networks of neuron-like units has led to dramatic recent progress in a variety of difficult AI problems. Now that we know how effective backpropagation can be in large networks, it is worth reconsidering the widely held belief that the cortex could not possibly be doing backpropagation. Drawing on joint work with Timothy Lillicrap, I will go through the main objections of neuroscientists and show that none of them are hard to overcome if we commit to representing error derivatives as temporal derivatives. This allows the same axon to carry information about the presence of some feature in the input and information about the derivative of the cost function with respect to the input to that neuron. It predicts spike-timing-dependent plasticity and it al...
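For background on the premise of the talk, here is a minimal sketch of ordinary backpropagation on a tiny two-layer network in NumPy. The layer sizes, squared-error loss and learning rate are arbitrary assumptions; the sketch only illustrates the per-unit error derivatives the abstract refers to, not the temporal-derivative scheme Hinton proposes for cortex.

# Standard backpropagation on a toy 4 -> 5 -> 2 network with sigmoid units.
# Illustrative only; sizes, loss and learning rate are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(scale=0.1, size=(4, 5)), np.zeros(5)
W2, b2 = rng.normal(scale=0.1, size=(5, 2)), np.zeros(2)

x = rng.random(4)                 # input vector
target = np.array([1.0, 0.0])     # desired output

# Forward pass.
h = sigmoid(x @ W1 + b1)
y = sigmoid(h @ W2 + b2)
loss = 0.5 * np.sum((y - target) ** 2)

# Backward pass: the derivative of the cost with respect to each unit's
# input, propagated from the output layer back toward the input.
delta2 = (y - target) * y * (1 - y)       # error derivatives at the output units
delta1 = (delta2 @ W2.T) * h * (1 - h)    # error derivatives at the hidden units

# One gradient-descent step on the weights.
lr = 0.5
W2 -= lr * np.outer(h, delta2)
W1 -= lr * np.outer(x, delta1)
print(f"loss before update: {loss:.4f}")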
The Hinton train collision was a railway accident that occurred on February 8, 1986. Twenty-three people were killed in a collision between a Canadian National Railway freight train and a Via Rail passenger train. It was the deadliest Canadian rail disaster since the Dugald accident of 1947, which killed thirty-one people, and would not be surpassed until the Lac-Mégantic rail disaster in 2013, which killed forty-seven. It was surmised that the accident was a result of the crew of the freight train becoming incapacitated, and the resulting investigations revealed serious flaws in CN's employee practices.
The godfather of neural networks, while running experiments, finds the meaning of life by accident: prompted with "The meaning of life is", his character-level neural network completes the sentence as "...the tradition of the ancient human reproduction: it is less favorable to the good boy for when to remove her bigger." Excerpt from: Geoffrey Hinton: "Some Applications of Deep Learning" https://www.youtube.com/watch?v=vYmclORWw5U
Interview with Geoffrey Hinton, BBVA Foundation Frontiers of Knowledge Award in Information and Communication Technologies in 2016 for his pioneering and highly influential work on endowing machines with the ability to learn. His research applies to multiple fields, from machine translation and photo classification programs to speech recognition systems and personal assistants like Siri, by way of such headline developments as self-driving cars, or the search for molecules of interest in new drug discovery.
Geoff Hinton comments on radiology and deep learning at the 2016 Machine Learning and Market for Intelligence Conference in Toronto
Geoff Hinton is one of the first researchers to have demonstrated the use of the generalized backpropagation algorithm for training multi-layer neural networks, and he is an important figure in the deep learning, machine learning, and computer vision communities. He is planning to "divide his time between his Toronto university research and his work at Google".
The BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies category goes, in this ninth edition, to artificial intelligence researcher Geoffrey Hinton, “for his pioneering and highly influential work” to endow machines with the ability to learn.