Ai-Da was invented by gallerist Aidan Meller in collaboration with Engineered Arts, a Cornish robotics company. Her drawing intelligence was developed by AI researchers at the University of Oxford, and her drawing arm was developed by Salaheldin Al Abd and Ziad Abass, undergraduate students from the School of Electronic and Electrical Engineering at the University of Leeds.
Artificial intelligence has been a part of our regular life for decades now. While there is still an air of the technological future to it and a wealth of detractors and those who fear its influence, it has already integrated into our society. And as it has also intermingled with artistic practices for some time now, projects combining the two realms are not a surprising thing to hear of touring internationally. What is surprising is seeing artist-robot Ai-Da detained at the Egyptian border upon suspicion of espionage.
Gallery director Aidan Meller is a specialist in modern and contemporary art and runs a gallery internationally. With over 20 years’ experience in the art business, he works closely with private collectors and is often consulted by those who wish to begin, or further develop, their collections. He regularly handles original works ranging from the likes of Picasso, Matisse and Chagall to older works by John Constable, Turner and Millais. Aidan is the visionary mind behind Ai-Da, the robot artist.
Ai-Da is the world’s first ultra-realistic artist robot. She draws using cameras in her eyes, her AI algorithms, and her robotic arm. Created in February 2019, she had her first solo show at the University of Oxford, ‘Unsecured Futures’, where her art encouraged viewers to think about our rapidly changing world. She has since travelled and exhibited work internationally, and had her first show in a major museum, the Design Museum, in 2021. She continues to create art that challenges our notions of creativity in a post-humanist era.
Brush clamped firmly in bionic hand, Ai-Da’s robotic arm moves slowly, dipping into a paint palette then making slow, deliberate strokes across the paper in front of her.
This, according to Aidan Meller, the creator of the world’s first ultra-realistic humanoid robot, Ai-Da, is “mind-blowing” and “groundbreaking” stuff.
In a small room at London’s British Library, Ai-Da – assigned the she/her pronoun – has become the first robot to paint as artists have painted for centuries.
Camera eyes fixed on her subject, AI algorithms prompt Ai-Da to interrogate, select, make decisions and, ultimately, create a painting. It’s painstaking work, taking more than five hours per painting, but no two works are exactly the same.
Yet the question Meller wants to raise with this, the first public demonstration of a creative, robotic painting, is not “can robots make art?”, but rather “now that robots can make art, do we humans really want them to?”
“We haven’t spent eye-watering amounts of time and money to make a very clever painter,” said Meller. “This project is an ethical project.”
With rapidly developing artificial intelligence, growing accessibility to super computers and machine learning on the up, Ai-Da – named after the computing pioneer Ada Lovelace – exists as a “comment and critique” on rapid technological change.
Ask Ai-Da – and yes, the Guardian did ask pre-submitted questions for her to answer – what she thinks of art, and her sophisticated language program responds like Siri on steroids.
She tells you she used machine learning to teach herself to paint, “which is different to humans”. Can she paint from imagination? “I like to paint what I see. You can paint from imagination, I guess, if you have an imagination. I have been seeing different things to humans as I do not have consciousness,” she responded in stilted fashion.
Can she appreciate art or beauty? “I do not have emotions like humans do; however, it is possible to train a machine learning system to learn to recognise emotional facial expressions,” she answered. The artists she most admires are Yoko Ono, Doris Salcedo, Michelangelo and Wassily Kandinsky.
But, can what she creates be truly considered art? “The answer to that question depends on what you mean by art,” she said, adding: “I am an artist if art means communicating something about who we are and whether we like where we are going. To be an artist is to illustrate the world around you.”
Devised in Oxford by Meller, Ai-Da was created by a team of programmers, roboticists, art experts and psychologists, completed in 2019, and is updated as AI technology improves. She has already demonstrated her ability to sketch and create poems.
Her new painting talent was unveiled ahead of the world premiere of her solo exhibition at the 2022 Venice Biennale, which opens to the public on 22 April.
Titled Leaping into the Metaverse, Ai-Da Robot’s Venice exhibition will explore the interface between human experience and AI technology, from Alan Turing to the metaverse, and will draw on Dante’s concepts of purgatory and hell to explore the future of humanity in a world where AI technology continues to encroach on everyday human life.
Soon, with the amount of data we freely give about ourselves, and through talking to our phones, computers, cars and even kitchen appliances, AI algorithms “are going to know you better than you do”, Meller warned.
We are entering a world, he said, “not understanding which is human and which is machine”.
In May 2019, Ai-Da executed a live performance called Privacy at St Hugh’s College, Oxford. This work was a homage to Yoko Ono’s seminal work Cut Piece.
In June 2019, Ai-Da’s artworks were featured in a gallery show called Unsecured Futures at St John’s College, Oxford.
In June 2019, Ai-Da featured at the Barbican Centre, London, in WIRED Pulse: AI.
In September 2019, Ai-Da was invited to Ars Electronica, Linz, Austria, for the European ARTificial Intelligence Lab exhibition entitled Out of the Box: The Midlife Crisis of the Digital Revolution.
In October 2019, Ai-Da collaborated with artist Sadie Clayton on a series of workshops at Tate Exchange, Tate Modern, London – Exploring Identity Through Technology – hosted by A Vibe Called Tech.
In November 2019, Ai-Da was invited to give a series of workshops at Abu Dhabi Art in Manarat Al Saadiyat, UAE.
In December 2019, Ai-Da had her first in-depth interview with Tim Marlow, the Artistic Director at the Royal Academy, as part of the Inspiration Series at Sarabande (the Alexander McQueen Foundation), London.
In February 2020, Ai-Da gave her first TEDx talk in Oxford, called The Intersection of Art and AI.
In July 2020, Ai-Da featured in The 1975’s music video for their song “Yeah I Know”, from their album Notes on a Conditional Form. In the video she was tasked with drawing what she thought the human consciousness looked like and composing a poem in response to the song lyrics.
In October 2020, Ai-Da featured in a virtual exhibition launched by the World Intellectual Property Organization (WIPO), a United Nations agency: “WIPO: AI and IP, A Virtual Experience”.
In May 2021, Ai-Da did her first artist residency at Porthmeor Studios, St Ives. She worked in Studio 5 in response to the art of Ben Nicholson, who worked in the same space during the 1930s and 1940s.
In May 2021, Ai-Da’s display Ai-Da: Portrait of the Robot was at the Design Museum, London.
In October 2021, while entering Egypt for an exhibition at the Great Pyramid of Giza, Ai-Da was held for ten days by border guards who “feared her robotics may have been hiding covert spy tools”.
On 22 April 2022, Ai-Da was scheduled to launch the world premiere of her solo exhibition titled “Leaping into the Metaverse” at the 59th Venice Biennale.
Ai-Da’s arms were created in collaboration with Salaheldin Al Abd and Ziad Abass, both undergraduate students from the School of Electronic and Electrical Engineering at the University of Leeds.
Ziad is an undergraduate student in Mechatronics and Robotics. Together with his classmate Salah Al Abd, he designed Ai-Da’s drawing arm and developed the AI algorithms used by the world’s first ultra-realistic humanoid AI robot artist. During his third year, Ziad undertook a placement year at IBM, working as a DevOps Software Engineer.
After doing some research and asking friends, Ziad found that the University of Leeds stood out from the rest; he saw the University as the perfect setting for him, a place that strikes an optimum balance between the academic and the social aspects of the university experience.
“The future of almost every industry involves robotics, mechatronics or machine intelligence. I chose to study Mechatronics and Robotics at Leeds because the course not only manages to integrate mechanical systems with electrical systems, but also digs deep into the computer science fundamentals which mechatronic systems would be useless without. Furthermore, I have always been interested in understanding how intricate machines and software function, and I felt that this course would help me develop this understanding and prepare me to be an innovator in my own right.”
“One of the best aspects of the course has been the opportunity to work on projects. Project work is assigned from as early as year one, which, for me, was crucial to developing a deep understanding of what was being taught in lectures. For example, as a team of six, we designed an autonomous mechatronic arm in SolidWorks, programmed it and built it from scratch. The objective was to create a proof of concept for an arm that can guide a micro robot inside the body from the outside, making colonoscopy procedures non-invasive and much easier and more accurate. Our team achieved third place in the competition.
More recently, I worked on developing an interface system based on neuromuscular signals from hand/forearm gestures, to be employed in stroke rehabilitation. I developed a wearable sleeve that can capture muscle signals from the forearm through dry, 3D-printed electrodes and interpret them. Machine learning was then employed to train a classifier that can recognise which hand gesture has been performed based on the muscle signals read.
I also enjoyed the lack of monotony in the structure of the course. In terms of teaching, it focuses on only the most practical modules from each of the three involved schools (Electronic and Electrical Engineering, Mechanical Engineering and Computing), therefore helping one focus on what really matters. In terms of assessment, the methods varied greatly from module to module, and this minimised the tediousness of simply sitting countless tests and exams.”
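The gesture-recognition step Ziad describes in the quote above is a fairly standard machine learning pipeline: windows of multi-channel muscle (EMG) signal are reduced to a handful of features, and a classifier is trained to map those features to gesture labels. The actual sleeve hardware, channel count, features and model used in his project are not detailed here, so the sketch below is purely illustrative, built on synthetic data and common time-domain EMG features.

```python
# Illustrative sketch only: the features, channel count and classifier are
# assumptions, not the system described in the article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def emg_features(window):
    """window: (n_samples, n_channels) array of raw EMG for one time window."""
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value per channel
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square per channel
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings per channel
    return np.concatenate([mav, rms, zc])

# Synthetic stand-in data: 400 windows of 200 samples x 4 channels, 3 gesture labels.
rng = np.random.default_rng(0)
windows = rng.normal(size=(400, 200, 4))
labels = rng.integers(0, 3, size=400)

X = np.array([emg_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a real rehabilitation setting the classifier would be trained on labelled recordings from the sleeve for each user, and the predicted gesture would then drive feedback or an assistive device.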
Named after the scientist Ada Lovelace, Ai-Da, the world’s first ultra-realistic humanoid AI robot artist, was invented by gallery director Aidan Meller. As an artist, Ai-Da draws, sculpts and creates collaborative paintings with humans.
Ziad and his classmate Salah Al Abd designed Ai-Da’s drawing arm and developed the AI algorithms used by Ai-Da to create a human drawing style inspired by 20th-century masters including Pablo Picasso and the German painter Max Beckmann.
The fact that we were implementing our skillset from university into such a big, relevant project taught us a lot about how the real world works and how to professionally deal with situations beyond the lecture theatre. This made us readier than ever to think even bigger and become part of more amazing projects in the future.
Ziad Abass
“We implemented smart algorithms that use computer vision to analyse who Ai-Da sees, then developed a control system that produces a path for her arm to follow, based on her interpretation. We then worked alongside Engineered Arts (the company that built the rest of Ai-Da) to integrate our system with their existing one. We created a critical part of a robot that is the first of its kind. People and news agencies around the world were thrilled to find out about Ai-Da, who blurs the line between AI and creativity. The artwork created by her is made to promote discussion about the technological revolution we are in, as well as the possible futures of AI.”
Ziad Abass
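The pipeline Ziad outlines above (a camera image analysed with computer vision, then converted into a path for the arm to follow) can be sketched at a very high level. Ai-Da’s actual algorithms are not public, so the edge-and-contour approach, the function names and the coordinate mapping below are assumptions for illustration only.

```python
# A minimal, hypothetical sketch: image -> edges -> contours -> arm waypoints.
import cv2
import numpy as np

def image_to_drawing_path(image_path, canvas_size=(0.30, 0.40), min_points=20):
    """Convert a camera image into a list of strokes, each a sequence of
    (x, y) waypoints in metres on a hypothetical canvas."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)

    # "Interpretation" step: reduce the image to salient edges.
    edges = cv2.Canny(cv2.GaussianBlur(img, (5, 5), 0), 50, 150)

    # Each contour becomes one candidate pen stroke.
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

    h, w = img.shape
    strokes = []
    for contour in contours:
        if len(contour) < min_points:
            continue  # skip tiny fragments that would produce jittery strokes
        pts = contour.reshape(-1, 2).astype(np.float64)
        # Map pixel coordinates onto the canvas (simple linear scaling here;
        # a real system would calibrate camera-to-arm coordinates properly).
        xs = pts[:, 0] / w * canvas_size[0]
        ys = pts[:, 1] / h * canvas_size[1]
        strokes.append(list(zip(xs, ys)))
    return strokes

if __name__ == "__main__":
    for stroke in image_to_drawing_path("sitter.jpg"):
        # A downstream control system would convert each waypoint into joint
        # angles (inverse kinematics) and velocity commands for the arm.
        print(f"stroke with {len(stroke)} waypoints, starting at {stroke[0]}")
```

The split mirrors what the quote describes: the vision stage decides what to draw, and the control stage turns that decision into motion commands integrated with Engineered Arts’ robot platform.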
Sources: intelligenthq, en.wikipedia.org, ai-darobot, theguardian, eps.leeds.ac.uk, time