5 Roles That Make Artificial Intelligence a Game Changer in the Education Industry

Science fiction writers, futurists, and filmmakers have been predicting for decades the spectacular (and sometimes catastrophic) changes that may arise with the advent of widespread artificial intelligence.

The educational world is becoming more convenient and personalized thanks to the many applications of AI in education. AI has transformed the way people learn, as educational materials are now available to everyone through smart devices and computers.

Now, students do not have to attend physical classes to study, as long as they have a computer and an internet connection. Artificial intelligence also enables the automation of administrative tasks, allowing organizations to reduce the time needed to complete difficult tasks so that educators can spend more time with students. It is time to discuss the transformations that AI has brought to education.

Although we have not yet created self-aware robots like those in popular movies such as 2001: A Space Odyssey and Star Wars, we have made smart and frequent use of AI technology in a wide range of applications that, while not as mind-blowing as androids, still change our daily lives. One area where artificial intelligence is poised to make big changes (and in some cases is already making them) is education.

While we may not see humanoid robots acting as teachers within the next decade, there are already many projects that use computer intelligence to help students and teachers get more out of the educational experience. These tools are just a few of the paths being explored, and those that follow them will shape and define the future of education.

1. Artificial intelligence automates basic activities in education such as grading.

In college, grading homework and tests for large lecture courses can be tedious work, even when TAs split it between them. Even in the lower grades, teachers often find that grading takes up considerable time that could be used to interact with students, prepare for class, or work on professional development.

AI may never fully replace human grading, but it is getting close. It is now possible for teachers to automate grading for almost all types of multiple-choice questions, and automated grading of fill-in-the-blank tests and student writing may not be far behind. Essay-grading software is still in its infancy and not quite up to par, but it can (and will) improve in the coming years, enabling teachers to focus more on in-class activities and student interaction than on grading.
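For objective formats, the automation is conceptually simple. A minimal sketch of multiple-choice auto-grading (the quiz and answer key here are invented for illustration):

```python
# Hypothetical answer key for a 4-question multiple-choice quiz
ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}

def grade(submission):
    """Return the fraction of questions answered correctly."""
    correct = sum(1 for q, ans in submission.items() if ANSWER_KEY.get(q) == ans)
    return correct / len(ANSWER_KEY)

score = grade({1: "B", 2: "D", 3: "C", 4: "C"})
print(f"{score:.0%}")  # 3 of 4 correct -> 75%
```

Real grading systems add answer-sheet scanning and analytics on top, but the core comparison against a key is exactly this cheap, which is why multiple-choice grading was automated first.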

2. Educational software is tailored to the needs of students.

From kindergarten to graduate school, one of the key ways artificial intelligence will influence education is through higher levels of personalized learning. Some of this is already happening through the growing number of adaptive learning programs, games, and software. These systems respond to the needs of the student, placing greater emphasis on certain topics, repeating material that students have not mastered, and generally helping students work at their own pace.

This sort of custom-tailored learning can be a machine-assisted solution that helps students of different levels work together in one classroom, while teachers facilitate learning and provide help when needed. Adaptive learning has already made a huge impact on education across the country (especially through programs like Khan Academy), and as AI develops over the coming decades, such adaptive programs will only improve and expand.
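The pacing logic behind such systems can be sketched very simply: drill whichever topic the student is weakest in until it crosses a mastery threshold. (The `next_topic` helper and the 80% threshold below are invented for illustration, not taken from any particular product.)

```python
def next_topic(history, mastery=0.8):
    """history maps topic -> (correct, attempted).
    Return the weakest topic, or None once every topic is mastered."""
    accuracy = {t: c / a for t, (c, a) in history.items() if a > 0}
    weakest = min(accuracy, key=accuracy.get)
    return weakest if accuracy[weakest] < mastery else None

history = {"fractions": (3, 10), "decimals": (9, 10), "percent": (7, 10)}
print(next_topic(history))  # "fractions" (30% accuracy is the weakest)
```

Production systems replace the raw accuracy with a statistical model of mastery, but the loop — measure, find the gap, repeat the material — is the same.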

3. It can point out areas where courses need to be improved.

Teachers may not always be aware of gaps in their lectures and educational materials that can leave students confused about certain topics. Artificial intelligence offers a way to solve that problem. Coursera, a large massive-open-online-course provider, is already putting it into practice. When a large number of learners submit the wrong answer to a homework assignment, the system alerts the teacher and gives future students a customized message that hints at the correct answer.

This type of system helps fill in the gaps in interpretation that occur in courses and helps ensure that students are building the same conceptual foundation. Instead of waiting to hear back from the professor, students get immediate feedback, which helps them understand a concept and remember exactly how to do it next time.

4. Students may receive additional support from AI tutors.

While it is clear that machines cannot yet provide everything a human tutor can, some students in the future may well be taught by tutors made of nothing but zeros and ones. Some tutoring programs based on artificial intelligence already exist and can help students through basic mathematics, writing, and other subjects.

These programs can teach students the fundamentals, but so far they are not ideal for helping students develop higher-order thinking and creativity, areas where real-world teachers are still needed. Yet that should not rule out the possibility of AI tutors doing these things in the future. Given the rapid pace of technological advancement over the past few decades, advanced tutoring systems may not be a pipe dream.

5. AI-driven programs can provide helpful feedback to students and faculty.

AI not only helps teachers and students craft courses customized to their needs, but it can also provide feedback on the success of the course as a whole. Some institutions, especially those with online offerings, use AI systems to monitor student progress and alert professors when there is a problem with a student's performance.

These types of AI systems allow students to get the help they need and professors to find areas where they can improve instruction for students who are struggling with the subject matter. Nor are such programs limited to advice on individual courses: some schools are working on systems that help students choose majors based on the areas where they succeed and struggle. Students don't have to take the advice, but it represents a brave new world of college major selection for future students.

5 Roles That Artificial Intelligence A Game Changer In Education Industry was originally published in Becoming Human: Artificial Intelligence Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.

Via https://becominghuman.ai/5-roles-that-artificial-intelligence-a-game-changer-in-education-industry-a265292ef807?source=rss—-5e5bef33608a—4

source https://365datascience.weebly.com/the-best-data-science-blog-2020/5-roles-that-artificial-intelligence-a-game-changer-in-education-industry

A Concise Course in Statistical Inference: The Free eBook

Check out this freely available book, All of Statistics: A Concise Course in Statistical Inference, and learn the probability and statistics needed for success in data science.

Originally from KDnuggets https://ift.tt/2KBGo2O

source https://365datascience.weebly.com/the-best-data-science-blog-2020/a-concise-course-in-statistical-inference-the-free-ebook

Top Stories Apr 20-26: The Super Duper NLP Repo; Free High-Quality Machine Learning & Data Science Books & Courses

Also: Should Data Scientists Model COVID19 and other Biological Events; 5 Papers on CNNs Every Data Scientist Should Read; 24 Best (and Free) Books To Understand Machine Learning; Mathematics for Machine Learning: The Free eBook; Find Your Perfect Fit: A Quick Guide for Job Roles in the Data World

Originally from KDnuggets https://ift.tt/2Y6pXDP

source https://365datascience.weebly.com/the-best-data-science-blog-2020/top-stories-apr-20-26-the-super-duper-nlp-repo-free-high-quality-machine-learning-data-science-books-courses

How to Transition to Data Science from Economics?

Why transition to Data Science from Economics?

Have you ever wondered “So, what’s next for me?”

Well, you’re not alone! Many graduates aren’t too sure what they want to do after graduation. That’s especially true for Econ majors. Trust me – I am one.

And one of the often-overlooked options is data science.

So, in this article, I’ll tell you how to transition to data science from economics.

I’ll examine the good, the bad, and the ugly; answer some of the most important questions running through your mind, like: “Can I”, “Should I” and “How can I” make this switch. And I’ll explain the pros and cons before finding the best way to transition to data science from economics.

How to Transition to Data Science From Economics

Let’s start with “Can I make the switch?”

The answer here is a resounding “Yes!”.

Roughly 13% of current data scientists have an Economics degree. For comparison, the most well-represented discipline is data science and analysis, which takes up 21% of the pie. Therefore, Economics is indeed a competitive discipline when it comes to data science.

This isn’t at all surprising for several reasons.

First, unlike STEM disciplines, the social sciences help develop great presentation skills that are essential for any data scientist.

Through presentations and open discussions, students learn how to present a topic, as well as argue for or against a given statement. These activities help develop a confident and credible way of showcasing actionable insights. Moreover, most econ majors care deeply about human behavior and responses to different stimuli.

Hence, social-studies majors can capably serve as mediators between the team and management.

Second, economists often have a different approach than Computer Science or Data Science majors.

Due to their superior understanding of causal relations, social-science graduates can add another perspective when looking at the data and the results. This is extremely important because their causal-inference training allows them to think beyond the numbers and extract actionable insights.

Furthermore, Economics frequently intertwines with Mathematics, Finance, Psychology, and Politics.

Therefore, an economist’s approach is always meant to be interdisciplinary.

Finally, the technical capabilities of an economist are often quite impressive.

An average economist has a good understanding of Machine Learning without really referring to it as such. Linear regressions and logistic regressions are studied in almost all economics degrees.
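As a rough illustration (pure NumPy, made-up data), the logistic regression covered in any econometrics course can be fitted with a few lines of gradient ascent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative data: x is years of education, y a binary outcome
# (e.g. labor-force participation)
x = np.array([8, 10, 12, 12, 14, 16, 16, 18], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1, 1, 1], dtype=float)

b0, b1 = 0.0, 0.0        # intercept and slope
lr = 0.001
for _ in range(50_000):  # gradient ascent on the log-likelihood
    p = sigmoid(b0 + b1 * x)
    b0 += lr * np.sum(y - p)
    b1 += lr * np.sum((y - p) * x)

# Predicted probability of y = 1 at 17 years of education
print(round(float(sigmoid(b0 + b1 * 17)), 2))
```

The fitted slope `b1` is the same coefficient an econometrician would interpret; machine learning libraries simply wrap this estimation in a more general interface.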

I think we are pretty convinced about the “Can I” part. So, let’s move to the “Should I” part.

Should I transition to Data Science from Economics?

Well, the answer here is “Yes” – with a very small asterisk next to it.

Now, any Economics graduate possesses many of the required skills to transition into Data Science, but that doesn’t necessarily suggest they should do it… They might be more suited for something else.

For example, an Economics graduate with an affinity for Political science will most likely thrive better in a policy advisory role in a bank or hedge fund or even in a government position. Similarly, less-coding-savvy social-studies graduates are a finer fit for data analyst positions, where machine learning algorithms are relied upon less frequently. It’s not that either one wouldn’t be able to succeed as a data scientist, but their skills are better suited for different career paths.

So, let’s look at the question like an economist would – through the lens of incentives.

Where does one find the incentives? That’s right – in a job ad.

The main components of a job ad are the level of education, years of experience, and indispensable skills.

We already discussed how popular Economics is compared to STEM degrees, so you know it's a good choice for a potential career as a data scientist. When it comes to economics degrees, 43% of the job ads in our research require a BA, and an additional 40% a Master's. In other words, thanks to the interdisciplinary nature of the social sciences, you don't need a doctorate to be successful in the field.

As for years of experience, if you’re transitioning from another position in business, you’ve probably had to do some analytical thinking already.

Usually, 3 to 4 years in such a setting are enough to ensure a smooth transition. But this is tightly related to your level of education. A Master of Science will need two fewer years of experience in a business setting due to their additional academic qualifications.

However, if you’re trying to make a transition straight out of college, you might want to go for an entry-level job in the field.

When it comes to skills, one of the key parts is understanding statistical results and their implications.

Luckily, economics degrees are often based on statistical study cases and experiments, so you should feel comfortable interpreting the results. Of course, this expands to understanding the intuition behind machine learning algorithms and their limitations. As we already stated, Econometrics incorporates linear and logistic regressions, so Economics graduates have a great grasp of the intuition behind Machine Learning models.

Additional skills listed in such job ads include problem solving and strong analytical thinking.

A lot of economics degrees heavily rely on examining study cases, solving practical examples, and analyzing published papers, so you probably possess these qualities already.

Of course, communication skills are essential when working in a team.

As mentioned earlier, Economics graduates often serve as a bridge between the data science team and higher management.

Lastly, anybody making the switch to data science needs a certain coding pedigree.

Whether it’s R, Python, or both, knowing how to use such software is a must if you want to succeed in the field.

If you’re an Economist in your 20s, we can assume you have seen some Python or R code. Hence, you only need to gather more work experience in a business setting.

If you are above 30 and not a Computer Science graduate, you most probably didn't use computers much in your university classes. So, you may think your main challenge is the lack of programming skills. But that shouldn't be the case.

Just focus on the technical part – programming and the latest software technologies.

Coding has never been easier, and anyone can learn. Especially a person from an economics background. We all know you have seen some very complicated stuff.

We answered the “can” and “should” parts of the discussion, so let’s dive into the “how-to” part.

How can I transition to Data Science from Economics?

There are generally 4 crucial things you need to do to make the switch.

The first one is picking your spot.

As discussed, there is plenty of room for Economics graduates in data science. All you need to do is make sure you're ready to fit exactly that role and demonstrate your strengths.

Employers value your understanding of causal inference, so you need to highlight that in your application.

Showcase the analytical part of your work. Mention insights you gained through research or academic work and quote their measurable impact. These bring credibility and provide recruiters with a glimpse of what they’ll be getting once they hire you.

Second, use your social-science advantage.

By knowing how surveys and experiments are constructed, you know where to look when examining the results. You see beyond the data and understand which Machine Learning approach should work best in each case.

In contrast, Data Science and Computer Science graduates often have a mindset of “How can I pre-process the data before I run a machine learning algorithm?”, instead of looking at the way the data was gathered. Your understanding of collinearity, reverse causality, and biases can help you accurately quantify interdependence within the data. Thus, you can have great synergy with the rest of the members on your team.

The third and most crucial change you need to make is to adapt your way of thinking.

Even though the cause-and-effect mentality will help you settle into your career, you need to be able to look for other things as well. The findings of neural network algorithms can be confusing because they discover patterns rather than causal links. Hence, you need to be ready to demonstrate flexibility in your thinking and adjust accordingly.

Of course, this isn’t a change that can happen overnight, but rather one that happens gradually with experience.

Last but not least, you’ll need to learn a programming language or BI software.

Lucky for you, programming languages such as Python and R aren’t that hard to learn. And once you’re fluent in one programming language, you can easily master another one, despite coming from an economics background.

This also falls into the “learn as we go” area, so just make sure to be proficient in at least one of either Python or R, and your transition into the field should be smooth as butter.

All things considered, Economics majors can, and should, try to pursue a career in data science: they have the necessary skills, and there is high market demand. Economics skills are a valuable addition to any data science team, and there is no doubt that you, dear Econ major, could be the person who brings them.

Ready to take the next step towards a data science career?

Check out the complete Data Science Program today. Start with the fundamentals with our Statistics, Maths, and Excel courses. Build up step-by-step experience with SQL, Python, R, Power BI, and Tableau. And upgrade your skillset with Machine Learning, Deep Learning, Credit Risk Modeling, Time Series Analysis, and Customer Analytics in Python. Still not sure you want to turn your interest in data science into a career? You can explore the curriculum or sign up for 12 hours of beginner-to-advanced video content for free by clicking on the button below.

The post How to Transition to Data Science from Economics? appeared first on 365 Data Science.

from 365 Data Science https://ift.tt/35h1ehK

Google Open Sources SimCLR A Framework for Self-Supervised and Semi-Supervised Image Training

The new framework uses contrastive learning to improve image analysis in unlabeled datasets.

Originally from KDnuggets https://ift.tt/3aJ4wLy

source https://365datascience.weebly.com/the-best-data-science-blog-2020/google-open-sources-simclr-a-framework-for-self-supervised-and-semi-supervised-image-training

What Is a Tensor?

Tensors have been around for nearly 200 years. In fact, the word ‘tensor’ was first introduced by William Hamilton. Interestingly, what the word meant back then had little to do with what we have called tensors from 1898 onward.

How did tensors become important, you may ask? Well, not without the help of one of the biggest names in science – Albert Einstein! Einstein developed and formulated the whole theory of general relativity entirely in the language of tensors. In doing so, Einstein, while not a big fan of tensors himself, popularized tensor calculus more than anyone else could have.

Nowadays, we can argue that the word ‘tensor’ is still a bit ‘underground’. You won’t hear it in high school. In fact, your math teacher may never have heard of it. However, state-of-the-art machine learning frameworks are doubling down on tensors. The most prominent example is Google’s TensorFlow.

What is a tensor in Layman’s terms?

The mathematical concept of a tensor could be broadly explained in this way.

A scalar has the lowest dimensionality and is always 1×1. It can be thought of as a vector of length 1, or a 1×1 matrix.

It is followed by the vector, where each element is a scalar. Dimension-wise, a vector is nothing but an M×1 or 1×M matrix.

Okay.

Then we have matrices, which are nothing more than a collection of vectors. The dimensions of a matrix are M×N. In other words, a matrix is a collection of N column vectors of dimension M×1, or M row vectors of dimension 1×N.

Furthermore, since scalars make up vectors, you can also think of a matrix as a collection of scalars, too.

Now, a tensor is the most general concept.

Scalars, vectors, and matrices are all tensors of ranks 0, 1, and 2, respectively. Tensors are simply a generalization of the concepts we have seen so far.

An object we haven’t seen is a tensor of rank 3. Its dimensions could be signified by k, m, and n, making it a K×M×N object. Such an object can be thought of as a collection of matrices.

How do you ‘code’ a tensor?

Let’s look at that in the context of Python.

In terms of programming, a tensor is no different from a NumPy ndarray. In fact, tensors can be stored in ndarrays, and that is how we often handle them in practice.

Let’s create a tensor out of two matrices.

Our first matrix m1 will be a matrix with two vectors: [5, 12, 6] and [-3, 0, 14].

The matrix m2 will be a different one with the elements: [9, 8, 7] and [1, 3, -5].
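Assuming NumPy, the two matrices above can be written as:

```python
import numpy as np

# The two 2x3 matrices from the example
m1 = np.array([[5, 12, 6], [-3, 0, 14]])
m2 = np.array([[9, 8, 7], [1, 3, -5]])
print(m1.shape)  # (2, 3)
```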

Now, let’s create an array, T, with two elements: m1 and m2.

After printing T, we realize that it contains both matrices.
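Putting those steps into code (NumPy assumed, with the same matrices as above):

```python
import numpy as np

m1 = np.array([[5, 12, 6], [-3, 0, 14]])
m2 = np.array([[9, 8, 7], [1, 3, -5]])

T = np.array([m1, m2])  # an array with two elements: m1 and m2
print(T)
print(T.shape)  # (2, 2, 3)
```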

It is a 2×2×3 object. It contains two matrices, 2×3 each.

Alright.

If we want to manually create the same tensor, we would need to write this line of code.
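With the same values as in the example above, that line looks like this:

```python
import numpy as np

# The full 2x2x3 tensor, spelled out element by element
t_manual = np.array([[[ 5, 12,  6],
                      [-3,  0, 14]],
                     [[ 9,  8,  7],
                      [ 1,  3, -5]]])
print(t_manual.shape)  # (2, 2, 3)
```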

As you can imagine, tensors with lots of elements are very hard to manually create. Not only because there are many elements, but also because of those confusing brackets.

Usually, we would load, transform, and preprocess the data to get tensors. However, it is always good to have the theoretical background.

Why are tensors useful in TensorFlow?

After this short intro to tensors, one question remains – why is TensorFlow called that, and why does the framework need tensors at all?

First of all, Einstein has successfully proven that tensors are useful.

Second, in machine learning, we often explain a single object with several dimensions. For instance, a photo is described by pixels. Each pixel has intensity, position, and depth (color). If we are talking about a 3D movie experience, a pixel could be perceived in a different way from each of our eyes. That’s where tensors come in handy – no matter the number of additional attributes we want to add to describe an object, we can simply add an extra dimension in our tensor. This makes them extremely scalable, too.
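A tiny sketch of that idea (a made-up 4×4 image, NumPy assumed):

```python
import numpy as np

# A made-up 4x4 RGB "photo": height x width x color depth
image = np.zeros((4, 4, 3), dtype=np.uint8)
image[0, 0] = [255, 0, 0]          # top-left pixel is pure red

# Describing one more attribute (say, one view per eye for 3D)
# just adds one more dimension to the tensor:
stereo = np.stack([image, image])  # shape (2, 4, 4, 3)
print(image.shape, stereo.shape)
```

That extra-axis trick is exactly the scalability the paragraph above describes: new attributes never force a new data structure, only a new dimension.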

Finally, we’ve got different frameworks and programming languages. For instance, R is famously a vector-oriented programming language. This means that the lowest unit is not an integer or a float; instead, it is a vector. In the same way, TensorFlow works with tensors. This not only optimizes CPU usage but also allows us to employ GPUs for the calculations. What’s more, in 2016 Google developed TPUs (tensor processing units) – processors that treat a tensor, rather than the 0s and 1s a CPU works with, as the building block of a calculation, making such calculations dramatically faster.

So, tensors are a great addition to our toolkit, if we are looking to expand into machine and deep learning. If you want to get into that, you can learn more about TensorFlow and the other popular deep learning frameworks here.

The post What Is a Tensor? appeared first on 365 Data Science.

from 365 Data Science https://ift.tt/3aLphpT

Learning during a crisis (Data Science 90-day learning challenge)

How can you keep your focus and drive during a global crisis? Take on a 90-day learning challenge for data science and check out this list of books and courses to follow.

Originally from KDnuggets https://ift.tt/3cNFHzN

source https://365datascience.weebly.com/the-best-data-science-blog-2020/learning-during-a-crisis-data-science-90-day-learning-challenge

The Super Duper NLP Repo: 100 Ready-to-Run Colab Notebooks

Check out this repository of more than 100 freely-accessible NLP notebooks, curated from around the internet, and ready to launch in Colab with a single click.

Originally from KDnuggets https://ift.tt/2VAiQ52

source https://365datascience.weebly.com/the-best-data-science-blog-2020/the-super-duper-nlp-repo-100-ready-to-run-colab-notebooks

How to Create Training Data Set for Machine Learning & AI in Agriculture?

AI is already playing a significant role in many fields, and agriculture is one where it can be implemented through various applications, systems, and machines that perform actions independently or analyze useful data for better farming.

In agriculture, automated tractors, drones, robots, and other computer-vision-based machines can visualize various scenarios and help the sector boost productivity. Robots and AI drones are trained through machine learning algorithms, and to train those algorithms you need a certain amount of data – sets of annotated images of crops and harvests.

Training Data Set for Machine Learning & AI in Agriculture

To train the robots, you need a data set in which the objects of interest – crops, fruits, or vegetables – are annotated with techniques like bounding boxes so the objects can be detected precisely. Anolytics provides bounding-box annotation for crop detection, fruit detection, and unwanted-crop detection, creating high-quality data sets for robots and autonomous machines.
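As an illustration, a single bounding-box record might look like the following (the field names and values are hypothetical, loosely modeled on the common COCO-style layout, not Anolytics' actual format):

```python
# One hypothetical bounding-box annotation record
annotation = {
    "image": "field_0001.jpg",
    "label": "unwanted_crop",   # e.g. a weed among the harvest
    "bbox": [42, 110, 64, 80],  # [x, y, width, height] in pixels
}

def bbox_area(a):
    """Pixel area of the annotated region."""
    _, _, w, h = a["bbox"]
    return w * h

print(bbox_area(annotation))  # 64 * 80 = 5120
```

A training set is then simply thousands of such records paired with their images, which a detection model consumes to learn where each object class appears.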

Training Data for Robots in Agriculture and Farming

You can create data sets for AI robots in agriculture with annotated fruits and vegetables showing the actual condition of the plants, so that the robots can take actions like plucking, spraying pesticides, or detecting and removing weeds and unwanted crops that drain the soil's nutrients. Using training data sets for harvesting and crop control is only possible when the data is created correctly.

Training Data for Aerial View Mapping of Agricultural Field

Another type of data needed for AI in agriculture is for drones, which can provide useful details about agricultural fields, checking soil condition through geo-sensing and monitoring the health of the crops. In the farming sector, AI drones can also monitor livestock like cows, buffalo, sheep, and other animals used in husbandry. Only high-quality training data can help drones learn these tasks.

Training Data for Livestock Management in Farming

For livestock management, drones are used to monitor animals grazing in open fields. To make these objects and animals recognizable to machines, you need to create data sets containing annotated fields and the different types of animals. Semantic segmentation and polygon annotation are the image annotation techniques used to create this kind of training data.

Drones and robots can likewise be trained to perform tasks like spraying pesticides or monitoring crops from an aerial view to protect them from harmful insects and other animals. Various AI-based applications can also provide details on how to grow crops better and improve yields.

All types of objects in the agricultural field – crops, fruits, vegetables, and more – need to be annotated and fed into machine learning algorithms so that the model can recognize various situations and act accordingly. Semantic segmentation is also one of the most important image annotation techniques for deep learning in agriculture.

Anolytics provides high-quality training data for machine learning and AI in agriculture. It can supply annotated images of all types of objects in the agricultural field with a high level of accuracy, and it specializes in image annotation services that provide machine learning training data sets for developing different types of AI models with a high rate of success in real-life use.

How to Create Training Data Set for Machine Learning & AI in Agriculture? was originally published in Becoming Human: Artificial Intelligence Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.

Via https://becominghuman.ai/how-to-create-training-data-set-for-machine-learning-ai-in-agriculture-8febc02fda9a?source=rss—-5e5bef33608a—4

source https://365datascience.weebly.com/the-best-data-science-blog-2020/how-to-create-training-data-set-for-machine-learning-ai-in-agriculture

The Impending AI Pandemic

What COVID-19 teaches us about the existential threat of artificial intelligence.

COVID-19, carbon-induced climate change, and nuclear winter carry existential risk, not due to their intelligence, but because they destroy large systems upon which we depend.

Our first existential digital threat is widely expected to be an artificial general intelligence, a grand AI capable of achieving any goal. Surprisingly, the real AI threat is not big and scary, like a monster, but small, systemic, and decentralized, like the coronavirus.

Replication

This threat is right under our noses. Internet worms, a type of computer virus that seems fairly innocuous, carry a creepy potential. These pieces of code autonomously spread from machine to machine. And they are already unstoppable.

Consider MyDoom, a worm unleashed in 2004. Samples of MyDoom were found in ~1% of all emails sent in 2018. Almost a trillion copies of this worm are scattered every year, waiting to be picked up by an unprotected host.

MyDoom exhibits considerable variation. As the worm spreads, it changes its code to avoid detection. This not only mimics the variation seen in biological evolution, but also provides a great foundation for evolutionary algorithms. Trillions of samples mean a juicy dataset, so many mutations can be tested and models can be trained.

Worm-style replication is so cheap and quick, it’s already a grand success. The classic sci-fi alternative — some kind of Terminator factory, staffed by Terminators that build yet more Terminators — seems silly once this simple, proven technique comes into view. Of course, that vision has been refined. Even in 1953, computing pioneer John von Neumann imagined the problem more abstractly: a pile of parts, and machines that used these parts to create clones of themselves. Today we see researchers pursuing 3d printing and self-assembly to realize contemporary, refined versions of such visions.

However, we are naive if we expect the first existential AI threat to use a complex and costly physical process to reproduce. A genius AI would not take the most expensive path. Internet worms provide a cheap, well-proven method of reproduction that also leaves ample resources for mischief: once a worm is inside a machine, it can do almost anything that machine can do.

Destruction

In terms of social and financial destruction, internet worms have nothing on the coronavirus. GandCrab, a particularly successful strain of ransomware, caused some $2 billion in damages, and even that is nothing compared with the trillion-dollar macroeconomic consequences of COVID-19.

Nevertheless, GandCrab's haul dwarfs the financial damage ever caused by leading-edge AI. Excellence in pattern recognition and decision-making does not translate into convincing embezzlement. Any smart AI would likely fall back on one of the proven, dumb techniques of extortion, such as locking you out of your laptop and demanding a Bitcoin ransom.

So far, in terms of physical destruction, internet worms haven’t held a candle to old-fashioned warfare. The Stuxnet worm gained international fame for a rather paltry act of physical destruction, the annihilation of a few centrifuges. But this may change in the future.

As computer-controlled vehicles and weapon systems develop, the physical risks of hostile worm infestations grow more dire. We don’t need to imagine some generalized AI driving around a killer drone. The navigation and targeting solutions — old-school, specific AI — could be triggered by a dumb worm.

Traditionally, viruses have not had much capacity to perform work. Due to network and machine limitations, they could not handle advanced computation. But things have changed; payloads are growing. The Flame worm was noteworthy almost a decade ago for its large, component-based software architecture that executed a variety of attacks. These days, the more common setup is a distributed botnet. Worms now run massive cryptomining operations, one of the most power-hungry (and profitable) forms of work a computer can do entirely on its own.
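Why is cryptomining such power-hungry work? Because proof-of-work is, by design, a brute-force search. A minimal sketch (toy difficulty and made-up data, nothing like a real coin's protocol) shows the shape of the computation a botnet farms out to its hosts:

```python
import hashlib

# Toy proof-of-work: find a nonce such that the SHA-256 hash of
# (data + nonce) starts with `difficulty` zero hex digits.
# Real cryptocurrencies use the same brute-force principle,
# just at vastly higher difficulty, hence the power bill.

def mine(data: bytes, difficulty: int = 4) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed guess is wasted electricity

nonce = mine(b"botnet-block", difficulty=4)
print(nonce)  # each extra zero of difficulty multiplies the search by ~16x
```

Nothing in this loop requires intelligence, only cycles, which is exactly why a dumb, distributed worm is a perfect vehicle for it.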

We don’t have to wait for a genius, general AI to realize that infinite sustenance for replication exists on a botnet. A very dumb AI (like the even dumber coronavirus) could instead merely stumble into such an ample host.


Havoc

All of this potential for destruction, of course, does not mean that armageddon will indeed be unleashed. There is copious debate in the AI community about what forces (if any) might drive a complex artificial general intelligence to do evil. We could spend weeks weighing the merits of the various speculations.

However, with internet worms, such projections need not be made. They leave a history of evil in their wake. Every day, human actors write code designed to overpower our computational ecosystem. With so much bad human intent that could accidentally slip out of control, machine evil does not need to evolve on its own. We already live in a world where our militaries join independent hackers in the hunt for disproportionate power.

The very forces that drive black hats to write malware could lead them to unleash an unstoppable distributed evil. Advanced intelligence is not necessary. While some toil away at a worm smart enough to invent new attack vectors, a smarter hacker will simply write a bot that scans for vulnerabilities as they are reported and deploys exploits against unpatched machines without even understanding them.

The exploits themselves are getting stronger and stronger as incremental results from AI research are applied. Learning systems have been used to automate spear phishing. Chatbots have been built to infiltrate social networking sites.

That there are humans working toward this very adaptable form of malware is not a question, but a reality. The powder keg is surrounded by sparks.

The Future Existential Threat

We’re fighting a microscopic, distributed population of pseudo-organisms. All of the arguments about when a machine will become “intelligent” become moot when something far more stupid can cause life as we know it to come crashing to a halt. As the coronavirus has shown, not very much intelligence is needed to tear society apart.

Don’t confuse “malicious” with “intelligent”. An existential threat to humanity is by definition destructive but, as the coronavirus shows, it need not be intelligent.

The lofty threat of an out-of-control artificial general intelligence may indeed exist. But let’s be real: before that threat ever materializes, we will face threats from dumber artificial brains, like the ones we already face.

Our primate brains are built to worry about monsters that we can see, fight or run away from. Scary monsters trigger our primal fears, developed over millennia living exposed in nature.

Our natural intuition does not serve us well in the late Information Age. Counterintuitively, our existential threats are not smart robots but rather dumb and distributed snippets of code.



The Impending AI Pandemic was originally published in Becoming Human: Artificial Intelligence Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.
