Almost everything we do today has an element of technology in it. From smart devices to advanced computers and sophisticated engineering equipment, technology is all around us.
But one question remains: how fast is technology advancing?
Statistics tracking how fast technology has grown over the years reveal breakthroughs in nearly every aspect of life, and experts predict there is more to come.
We compiled some of the most groundbreaking stats to enlighten you more on how far technology is progressing. In addition to this, we will shed more light on some of the upcoming trends, sure to leave you stunned!
Fascinating Technology Growth Statistics
The following are some eye-opening stats handpicked from the most reliable sources:
There are about 1.35 million tech startups around the world.
The number of smart devices collecting, analyzing, and sharing data should hit 50 billion by 2030.
The Internet adoption rate sits at 59% in 2021.
The computing and processing capacity of computers doubles roughly every 18 months.
The world has produced 90% of its Big Data in the past two years.
Every second, 127 new devices are connected to the internet.
In Q1 of 2021, 4.66 billion people were using the internet.
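One stat in the list above is worth unpacking: capacity that doubles every 18 months (the popular reading of Moore's law) compounds dramatically. Here is a minimal sketch of that arithmetic - the baseline and time spans are illustrative, not from any source:

```python
def capacity_after(months, doubling_period=18, base=1.0):
    """Relative processing capacity after `months`, assuming it
    doubles once every `doubling_period` months."""
    return base * 2 ** (months / doubling_period)

# One doubling period exactly doubles capacity.
print(capacity_after(18))
# A decade of 18-month doublings compounds to roughly 100x.
print(round(capacity_after(120)))
```

Even if the literal doubling pace has slowed in recent years, this compounding is why a decade of progress can feel discontinuous.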
Sounds amazing, right? That's just the tip of the iceberg - we have more in store for you. Read on to find out!
General Technology Growth Statistics
The following are some generalized statistics about how the growth of technology is influencing every sector.
1. The internet penetration rate in the world is at 59% as of January 2021.
(Source: Data Reportal)
In the last few decades, expanding telecommunications infrastructure has driven a steady rise in internet usage.
According to technology adoption statistics, the rate stands at almost 60% as of January 2021. Compared to Q1 of 2020, the rate has gone up by 7%.
2. $183.18 billion - that is how much the web hosting services marketplace is expected to have generated by 2026.
(Source: Fortune Business Insight)
In 2017, the global web hosting market was worth $32.12 billion, and by 2018 that figure had risen to $60.90 billion. Maintaining a compound annual growth rate (CAGR) of 15.1%, experts predict the web hosting industry will be worth well over $100 billion within a few years as global tech market growth continues. That's why there's such a fierce battle between the best hosting providers on the market.
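Forecasts like this follow the compound annual growth rate formula: future value = present value × (1 + CAGR)^years. A quick sketch using the figures quoted above - note that compounding the 2018 base at 15.1% lands near, but not exactly on, the $183.18 billion forecast, since the published rate and base year are rounded:

```python
def project(value, cagr, years):
    """Project a market size forward at a fixed compound annual growth rate."""
    return value * (1 + cagr) ** years

def implied_cagr(start, end, years):
    """Back out the CAGR implied by a start value, an end value, and a span."""
    return (end / start) ** (1 / years) - 1

# Compounding the 2018 figure ($60.90B) at 15.1% for eight years:
print(round(project(60.90, 0.151, 8), 2))              # ~187.59 ($ billions)
# CAGR actually implied by $60.90B (2018) growing to $183.18B (2026):
print(round(implied_cagr(60.90, 183.18, 8) * 100, 1))  # ~14.8 (%)
```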
3. There are 4.88 billion phone users in the world as of January 2021.
(Source: Bank My Cell)
According to technology growth statistics, 62% of the world’s population owns a mobile phone. Compared to 2020, the number of phone owners has gone up by 0.1 billion.
That includes both smart and feature phones.
Breaking down the number even further:
Smartphone owners are the majority here, amounting to 3.8 billion. On the other hand, feature phone owners are 1.08 billion.
4. By 2025, there will be 75 billion connected devices in the world.
(Source: Statista, MTA)
In 2025, the number of Internet of Things (IoT) devices will be three times that of 2019. Think slow cookers, wearable technology like smartwatches, smart meters, smartphones, etc.
The technology has become so popular that industry experts predict that every consumer will own about 15 IoT devices by 2030!
5. Google handled more than two trillion searches in 2020 alone.
Let’s have a closer look at the numbers:
There were 81,000 searches every second in 2020. That translates to about seven billion searches per day.
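A quick sanity check ties the per-second figure to the annual total. Assuming a constant 81,000 searches per second (real traffic of course fluctuates):

```python
SEARCHES_PER_SECOND = 81_000

per_day = SEARCHES_PER_SECOND * 60 * 60 * 24
per_year = per_day * 365

print(f"{per_day / 1e9:.1f} billion searches per day")     # ~7.0 billion
print(f"{per_year / 1e12:.2f} trillion searches per year") # ~2.55 trillion
```

That works out to roughly seven billion searches per day and about two and a half trillion per year - consistent with the two-trillion-plus figure for 2020.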
6. The need to reach new customers is the primary factor pioneering technological growth in the last few years (46%).
(Source: Finance Online)
Technology adoption statistics reveal that factors such as selling new business lines (38%), overall improvement of business operations (41%), improving sales and marketing (35%), and improving standard internal processes (33%) are the main drivers of tech growth.
7. By 2040, 95% of purchases will be online.
Buying over the internet is so convenient because you can get whatever you need from the comfort of your home regardless of time or location. According to technology growth stats, ecommerce will have grown so much that buyers will conduct almost all of their purchases online in the next two decades.
8. There are 3.96 billion social media users around the globe as of 2021.
Social media allows people to connect regardless of geographical location, at the negligible cost of a data bundle. And if you're on office or public WiFi, it costs you nothing at all.
According to technology statistics, for 2021, there are almost four billion social media users globally. That’s almost twice the number in 2015.
AI and Machine Learning Statistics
Technology stats and facts show that AI remains one of the most sought-after technological advancements pioneering technological growth around the world. Read on to find out some amazing stats on how AI and machine learning are impacting society.
9. Google Translate algorithm has increased its efficiency from 55% to 85% following the implementation of machine learning into its translation services.
(Source: Finance Online)
Google Translate is a service developed by Google to help users translate text and websites into any desired language. Before the introduction of AI, translating a series of words in a foreign language typically took longer, as the process was done one phrase at a time. With the application of deep learning (a sub-field of AI), however, Google Translate can now interpret a whole sentence or website at once.
10. The global machine learning market is expected to reach $20.83 billion in 2024.
(Source: Finance Online, Forbes)
Tech growth stats indicate that machine learning is currently one of the most popular and most successful sub-functions of AI.
It should come as no surprise that the market is growing in value. Worth around $1.58B in 2017, it is expected to reach $20.83B in 2024, growing at a CAGR of 44.06%.
11. The Compound Annual Growth Rate (CAGR) for AI will be 42.2% through 2027.
(Source: Grand View Research)
Stats on how fast technology is advancing reveal that the artificial intelligence market was worth $10.1 billion in 2018. In 2019, that value increased to $39.9 billion.
As you can see, there has been positive growth over the years, which is likely to continue.
Giant tech firms have been pouring big bucks into research and development, which is why technology keeps advancing every day. Big names investing heavily in this sector include Facebook, Amazon, Microsoft, Google, and Apple.
Industry players predict a CAGR of 42% between 2020 and 2027.
12. AI will replace around 85 million jobs in the US by 2025.
Does automation benefit the ordinary citizen?
You be the judge.
The pandemic led to massive job losses, leaving one in every four adults in serious financial difficulty and struggling to foot their bills. As a result, 33% of Americans dipped into their life savings to cover expenses, while others took out loans and are now deep in debt.
And it looks like the labor market hasn't seen anything yet.
AI statistics show that its adoption will lead to job losses to the tune of 85 million by the end of 2025.
However, it's not all bad:
Experts predict that artificial intelligence will also create 95 million new job openings. By 2025, working hours will be split roughly 50-50 between humans and machines.
13. Worldwide, only 37% of organizations have incorporated AI into their business.
Although the figure may not sound high, it represents a roughly 270% increase over 2015 adoption levels.
14. Experts estimate a 90% chance that Artificial General Intelligence (AGI) will arrive by 2075.
AGI mimics human intellect - think machines that can cook or style hair with human precision.
Experts predict that by 2075, nine out of 10 work environments will make use of AGI technology.
15. IT hiring was 7% lower than usual in Q3 of 2020.
IT job postings between August and September 2020 slowed noticeably. However, experts forecasted that it was only a small hitch that would pass in the coming months.
Technology adoption stats show that 68% of large organizations created more positions than they had in the second quarter of 2020. Smaller firms, meanwhile, appear to have been struggling and didn't have hiring budgets.
16. The fully and semi-automatic car market will be worth $26 billion by 2030.
(Source: Electronic Design)
Experts estimate that the number of connected cars in Europe, China, and the US will be about 470 million by 2025. Technology statistics show that the vehicles will generate data worth $750 billion.
While that sounds impressive, we should think about the security aspect. The information that the technology will derive could land in hackers’ hands instead of genuine parties like manufacturers or vendors.
It will be paramount for developers to come up with top-of-the-range security programs to keep cybercriminals at bay.
17. 71% of executives believe that artificial intelligence and machine learning are game-changers for businesses.
(Source: AMC Laboratories)
The world is beginning to wake up to the fact that robotics and automation powered by AI could be the future of work. However, some are more prepared than others. Those that fail to prepare may be left behind when the changes start to take effect.
18. 16% of companies in Europe believe that AI would help them counter the adverse effects of COVID-19 on labor.
Emerging technologies and automation will be at the forefront of cushioning businesses from the effects of the pandemic. Nearly 20% of organizations say that AI will be the only solution to the current shortage of workers.
19. According to 61% of marketers, AI is the most significant part of any business's data strategy.
(Source: Finance Online)
Data strategy is a set of informed decisions taken from a position of insight (after careful study of available data) on how best to move a business forward. It is the job of AI to study the set of available data, and help to draw insights as to existing flaws and what needs to improve.
20. The AI market will be worth over $15 trillion by 2030.
AI technology is progressing, and the industry is growing fast. Businesses and individuals alike love its efficiency, so demand will logically continue to rise in the coming years. By 2030, the market's value will exceed $15 trillion. In fact, more than three-quarters of emerging technologies were expected to have AI foundations as early as 2021.
Big Data Statistics
Data that has become too large and complex for traditional computer systems to make sense of is referred to as Big Data. However, Big Data impact statistics have shown that it can become a goldmine to whoever understands its capacity. Check out the statistics that follow to discover the impact of Big Data on technology and internet growth.
21. Organizations that are data-driven are 23x more likely to acquire new leads than those without a data-driven strategy.
Big Data can be a source of insight for those that care to put in the work, understand patterns in its data, and relate them to their various businesses. Facebook is a prime example of a company effectively utilizing both Big Data and AI to understand its audience better.
22. 91.6% of Fortune 1000 companies are investing more in Big Data and AI.
(Source: ZD Net)
Big Data is like the new gold for businesses. Coupled with AI, a good deal of information can be extracted from both structured and unstructured data. The Fortune 1000 companies know this. The most successful entrepreneurs also know this. For this reason, technology growth statistics tell us that these companies always have a specified budget put in place for data analytics.
23. Two-thirds of organizations that have utilized Big Data effectively have reportedly seen a decrease in operational expenses.
Big Data impact statistics reveal that for businesses that can cut through the clutter of useless data, Big Data can provide direct, specific information about what works for them and what does not. That way, these businesses avoid wasting time, effort, and resources on strategies that don't deliver results, and can instead focus all of that energy on what works.
24. We generate 2.5 quintillion bytes of data daily.
According to tech growth statistics, we now produce quintillions of bytes of data daily. This number has been rising for years, meaning we should expect to produce even more in the next 2-3 years.
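For a sense of scale, 2.5 quintillion bytes converts into more familiar units (decimal SI prefixes assumed):

```python
DAILY_BYTES = 2.5e18  # 2.5 quintillion bytes per day

exabytes_per_day = DAILY_BYTES / 1e18
gigabytes_per_day = DAILY_BYTES / 1e9
zettabytes_per_year = DAILY_BYTES * 365 / 1e21

print(f"{exabytes_per_day} EB/day")           # 2.5 exabytes per day
print(f"{gigabytes_per_day:.1e} GB/day")      # 2.5 billion gigabytes per day
print(f"~{zettabytes_per_year:.2f} ZB/year")  # just under one zettabyte per year
```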
25. Analytics and big data will bring in revenue of $274 billion by 2024.
Revenue from big data and analytics has been rising steadily over the last few years. By Q4 of 2021, data center Internet Protocol (IP) traffic had reached 19.5 zettabytes, and Business Intelligence (BI) analytics will be worth $14.5 billion in 2022.
26. By increasing their effectiveness at utilizing Big Data, Fortune 1,000 companies can increase their net income by up to $65 million.
According to Big Data impact statistics, the ability to extract, understand, and utilize Big Data has a direct impact on both sales and revenue. With Big Data, businesses can better understand their customers, thereby channeling their efforts towards what works and increasing conversion rates.
27. 71% of companies find it difficult to protect and manage unstructured data.
As enticing as the idea of Big Data analytics may seem, it still requires a highly technical, specialized skill set to make sense of the large chunks of available data. Thankfully, the best data visualization software can transform huge amounts of raw data into easy-to-digest visuals. These provide decision-makers with valuable insights quickly and easily.
28. 83% of organizations worldwide are currently investing in various Big Data projects.
Given how rapidly technology is growing and the massive volumes of data generated daily, top company executives are beginning to realize the usefulness of Big Data. Some even argue that for any business, failing to invest in Big Data is like walking toward bankruptcy.
Mobile Technology Statistics
Internet traffic growth statistics show that over the last decade, mobile usage has been on the rise, even surpassing desktop web traffic for the very first time in late 2015. Almost anything can now be achieved on mobile. Check out some of these mobile tech statistics to discover how vital technological advancements on mobile have become.
29. Over half of all web traffic comes from mobile phones as of Q1 of 2021.
If you're wondering how fast technology is growing - statistics for 2022 show mobile's share of web traffic has risen by 49.47 percentage points since 2011.
Back then, 93% of visitors came from the desktop, while mobile brought in a meager 6%.
However, the two went head to head in 2016, when the difference was about 1%, i.e., 48.25% for mobile and 46.93% via computers. As of 2021, cell phone browsing had surpassed that of other internet-connected devices.
30. 91% of internet users in 2020 were mobile phone owners.
According to internet traffic growth statistics, more than half the world population - around four billion people - was actively browsing the internet as of 2020. The world is indeed a global village, connecting billions of people who might otherwise never have met.
31. People check their mobile phones about 150 times daily.
(Source: Business Services Week)
Call it an addiction!
The fact remains that mobile phones have become a massive part of how most of us function daily. We check our phones almost all the time for messages, notifications, time, etc.
32. Total mobile connections in 2021 amount to 10.24 billion.
(Source: Bank My Cell)
How can that be, when phone ownership sits at just under five billion? Isn't the world population 7.84 billion?
Well, there are people with dual SIM cards. Then there are those with more than one device, not forgetting integrated devices like security systems or cars.
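The gap between connections and owners is simple division. Using this article's own figures (10.24 billion connections against the 4.88 billion phone owners from stat #3):

```python
connections = 10.24e9   # total mobile connections, 2021
phone_owners = 4.88e9   # phone owners, per stat #3 above

per_owner = connections / phone_owners
print(f"~{per_owner:.1f} connections per phone owner")  # ~2.1
```

Roughly two connections per owner - which dual SIMs, second devices, and embedded systems comfortably account for.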
33. 80% of smartphone users make use of their phones during physical shopping.
(Source: Business Services Week)
This could be either to read reviews of a product they are about to purchase or to locate an alternative store where they can compare products and prices. Either way, it goes to show the impact of technology on how we live our daily lives. Smart business owners who understand this can begin making adjustments toward mobile to boost their traffic and improve conversion rates.
34. 95.1% of the Facebook audience access the platform through their mobile phone.
(Source: Business Services Week)
Platforms like Google and LinkedIn have already implemented a mobile-first standard for their websites, and the reason for such a move is not far-fetched. Internet access growth statistics reveal that mobile drives the majority of the traffic on Google and other social media platforms.
35. Google Play and the Apple App Store have a combined 4.4 million mobile apps for download in 2020.
90% of the mobile apps on Google Play and the Apple App Store are free to download. Nevertheless, more businesses are beginning to understand the power of mobile apps and the amount of time consumers spend in various apps daily. Technology adoption stats show that, done right, mobile apps can help develop a brand image and improve customer loyalty.
36. About 56% of parents who have kids aged between 8 and 12 years have purchased mobile phones for them.
(Source: NCL Net)
Statistics about how fast technology is growing show that kids are growing up in a technologically advanced society. Back in the '80s and '90s, who would have thought that a 12-year-old would own a mobile phone, let alone an 8-year-old? However, that is the reality of today's world.
That's why solutions like parental control software are becoming more and more popular.
37. 98% of Generation Z have a mobile phone.
(Source: Global Web Index)
Technology advancement rate statistics further reveal that Generation X has a mobile penetration of 92%, while Baby Boomers sit at 42%.
38. Mobile advertising will reach $247 billion by 2022.
Technology adoption stats show that advertising through mobile will reach almost $300 billion by the end of 2022. That will be a $244 billion increase from 2011 figures.
39. There were 490 million new social media users in 2020.
(Source: Data Reportal)
Social media is getting more and more engaging by the day. The number of new users almost hit the five hundred million mark in 2020.
40. Increasing usage of mobile banking technologies could generate up to 95 million jobs.
A McKinsey Global Institute study found that over 80% of adults in developing countries owned a mobile device. However, only 55% had a bank account.
Mass adoption of mobile banking technologies has the potential to empower people financially. For one, technology adoption stats show that it can generate up to 95 million jobs and even increase GDP by a whopping $3.7 trillion by 2025.
Internet of Things (IoT) Statistics
Over the last few years, IoT has come to play a vital role across various industries, and more and more businesses now look to integrate its many benefits into their network infrastructures. The following are some of the most up-to-date statistics on the growth of IoT.
41. Every second, 127 new devices are connected to the internet.
With the availability of affordable computer chips (sensors) through nanotechnology and the ubiquity of wireless networks, almost anything can now be made a part of the IoT according to statistics about how fast technology is advancing.
42. There are 4.66 billion internet users as of the first quarter of 2021.
(Source: Data Reportal)
Just how fast is technology advancing in 2021?
The first few months of 2021 show that roughly 4.7 billion people are using the internet - about 60% of the world's population!
43. North America had the highest internet penetration rate globally in 2020.
(Source: Internet World Stats)
In December 2020, North America's internet access was the highest globally, at almost 90%. Europe was second with 87%, while Latin America took the third position with 72%.
Although Africa had the lowest internet penetration rate globally, it has made significant advancements in the last few years and recorded the fastest growth of any region.
Let’s look at the numbers:
According to internet growth stats, Africa had the highest growth rate at 13,941%. The Middle East followed with 5,528%, and Latin America came third with 2,545%.
44. Cellular IoT connections could reach 3.5 billion by 2023.
Cellular IoT connection is a feature that allows sensors to be able to transfer information directly to a computer or your mobile device within a region or specified distance. Health wearables that transfer the information about the state of health of a patient to a doctor or hospital is an excellent example of cellular IoT.
45. 75.44 billion IoT devices could be in existence by 2025.
IoT statistics reveal that there were over 25 billion IoT devices around the globe at the end of 2019. Statista predicts that there could be well over 50 billion by 2023.
46. 70% of all automobiles will be connected to the internet through the Internet of Things by 2023.
Technology statistics and findings show that the automobile industry is one of the areas where IoT innovation has seen significant improvements in the past few years. Aside from developing self-driving cars, research is underway to add many other features to automobiles through the Internet of Things. Soon, we could have vehicles that detect bad driving, accidents, and possibly imminent collisions. Cars that detect design flaws and send a report back to the manufacturer could also become the norm in years to come.
47. The IoT could generate up to $11 trillion in economic value per year by 2025.
Statistics that illustrate how fast technology is growing show that the global usefulness and availability of the Internet of Things is increasing at a breakneck pace. The IoT can save costs, increase productivity, create employment, and bring in billions and trillions in economic value in the process.
48. Around 44% of businesses use IoT to reduce costs.
IoT statistics show that more companies are pursuing smarter systems due to the technological growth in that sector.
As of 2021, about 44% of businesses use IoT devices to reduce costs. 37% of them use it to enhance operational processes, and 30% use them to grow revenue.
Interestingly, major tech websites are beginning to follow these technology trends and have started implementing similar systems.
Global Tech Market Growth Statistics
Technology touches our whole lives and has generated trillions of dollars in revenue and market size in the process. Discussed below are some incredible milestones to help you better understand the impact of technology on businesses around the world.
49. Worldwide spending on IT will amount to $3 trillion by 2021.
Predictions show that worldwide IT spending will exceed $3 trillion in 2021. 33% of this budget will go to hardware, while the rest will be for apps and related software. That will be positive growth compared to 2020, which saw a slowdown due to COVID-19 as most businesses aimed to cut costs.
50. By 2025, the wearable AI market is going to be worth $180 billion.
AI statistics reveal that as of 2018, the wearable AI market was already worth $35 billion. Growing at a CAGR of 30%, that figure is expected to surpass the $100 billion mark by 2025.
51. Big Data could attain a market size of $77 billion by 2023.
Big Data volume statistics show that its importance cannot be overemphasized. Hundreds of organizations around the world are already investing, directly and indirectly, in its capabilities. Insights from Big Data analytics can turn a small startup into a multinational organization in record time.
52. Income from AI hardware will be worth $234.6 billion in 2025.
Products in this category include storage devices, network products, Graphics Processing Units (GPUs), and Central Processing Units (CPUs). Forecasts show that in 2025, their market value will have surpassed 2018's by around $22 billion.
53. Successful companies like Netflix have been able to save up to $1 billion annually following the adoption of a machine learning algorithm.
(Source: Finance Online)
Netflix’s AI algorithm can accurately recommend which movies will get the attention of each user based on their interaction with the website. That way, user engagement is significantly increased, and the cancellation rate reduced, thus increasing the potential of having a user around for a more extended period. Without a doubt, Netflix's machine learning algorithm is one of the essential elements that make it one of the best streaming services out there.
54. Up to $657.31 billion will have been invested in the IoT by 2025.
(Source: Analytics Insight)
As of the end of 2019, the IoT market was already worth $193.60 billion. It could grow even further, at a CAGR of 21% yearly, if the technology growth rate is anything to go by.
Internet Growth Statistics
Initially designed only to interconnect government-owned research laboratories, the internet has expanded at an exponential rate over the last three decades.
55. Internet users around the world spend an average of 6 hours, 42 minutes online daily.
(Source: Digital Information World)
The most recent data places the average time spent online at more than six hours daily. The Philippines and Brazil log the most time online, at 10:02 and 9:20 hours, respectively. The US falls a little short of the global average, clocking in at 6:31 hours of daily internet time, while Japan and France spend the least time online, at 3:45 and 4:38 hours, respectively.
56. The median social media usage between 2019 and 2020 was 143 minutes daily.
Statistics show that social media users spent an average of two hours and 23 minutes daily on their favorite networking sites in 2020. The Philippines' citizens spend the most time on these sites, at about three hours a day.
57. Over 4.54 billion people are active internet users out of the 7.76 billion people in the world.
According to statistics that illustrate internet growth, the internet is growing at a pace of 11 new users per second - that is about 1 million new users daily. Between the fourth quarter of 2018 and that of 2019, 366 million new users came online, bringing the 2019 total to 4.39 billion. By the first quarter of 2020, that figure had risen to more than 4.54 billion.
58. There were 1.83 billion websites in January 2021.
(Source: Web Hosting Rating)
The web's growth took off around 2012, when the total number of websites approached the one billion mark. Growth of the internet statistics indicate an upward trend since then, with the number increasing by approximately 800 million as of 2021.
As of 2020, there were 20 million new domain registrations - close to a 5% increase from the last quarter of 2019.
59. 63% of 2021 internet surfers prefer Chrome.
As of 2021, six out of every 10 people visiting the internet do so via Chrome. Safari, the second most popular browser, doesn't even come close, with only about 19% of regular users. Mozilla Firefox and Samsung Internet tie for third place at 3.61% each.
60. The global ecommerce market is set to hit $6.54 trillion by 2022.
As of the end of 2019, the ecommerce market already had $4.2 trillion in sales. That number is expected to grow even further given that ecommerce is becoming the most preferred form of buying and selling around the globe.
61. More than 92% of internet users now consume video content online monthly.
(Source: Data Reportal)
Online video platforms like YouTube get massive traffic every second of every day. According to statistics, up to 500 hours of video are uploaded to YouTube every minute, and the platform has up to 1.9 billion users.
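That upload rate adds up to a staggering daily volume. Assuming a constant 500 hours uploaded per minute:

```python
HOURS_PER_MINUTE = 500

hours_per_day = HOURS_PER_MINUTE * 60 * 24            # 60 min/hour * 24 hours
years_of_viewing_per_day = hours_per_day / (24 * 365)

print(f"{hours_per_day:,} hours uploaded per day")
print(f"~{years_of_viewing_per_day:.0f} years of nonstop viewing, every day")
```

In other words, every single day YouTube receives more footage than one person could watch in a lifetime.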
62. The number one YouTube channel had 51.36 billion views.
Like Nastya (run by Anastasia Radzinskaya) had slightly over 51 billion lifetime views as of January 2021. Technology facts show that users find videos more entertaining and memorable. So keep that in mind, marketers!
Future Trends in Technology Growth
When it comes to technological trends that will rule the world in the near future, a few inventions come to mind: Blockchain, cloud computing, AR/VR, robotics, and automation. The stats that follow will expose you to some of these trends.
63. There will be 8.4 billion voice assistants by 2024.
In 2020, there were about 4.2 billion voice assistants in use. That number will double by 2024, reaching around 8.4 billion units. The world's human population is under 8 billion, so let's hope it's not the dawn of the Matrix.
64. By 2025, 500 million Virtual Reality headsets would have been sold.
An increasing smartphone adoption rate, the automobile industry, military and law enforcement training, the gaming industry, and growing technology awareness are some of the significant factors influencing the increasing need for VR headsets.
65. 94% of the internet workload will be processed on the cloud by the end of 2021.
(Source: Network World)
Since its introduction to the mainstream market, cloud computing has shown massive year-over-year growth. Experts believe that it will soon render traditional data centers obsolete. As of 2018, the cloud was already handling 45% of the internet workload, and that number will rise even further in a few years.
66. The Blockchain technology industry’s revenue is predicted to hit $20 billion by 2024.
Experts predict that as time goes on, blockchain will find usefulness across multiple industries due to its secure and sophisticated network. Currently, there are ongoing successful experiments to combine Blockchain and Big Data to ensure uniformity and accuracy of results, especially in the insurance sector, and many more will follow in the years to come.
How Is Technology Affecting Our Lives?
There is no doubt about how vital technology has become to our daily lives. Technological progress has made life both more comfortable and more efficient for the average human. The following stats expound on how technology is influencing our lives in general:
67. Technology has made communication easier.
(Source: Thrive Global)
The younger generations won't remember the days when there were no mobile phones. Today, anyone can pick up the phone and place a phone call to loved ones, irrespective of their location in the world. Plus, the coming of the internet and social media has made staying connected even cheaper.
68. Technology has improved advertising.
(Source: Thrive Global)
Billboards are becoming outdated, and door-to-door advertising is said to have gone extinct. With the internet, businesses can now reach their targeted audience with ease and still obtain better conversion rates than the old system of advertising.
69. Learning is now more efficient and easier to carry out with technology.
(Source: Thrive Global)
In the past, you had to scour the library for books on specific subjects that you intended to study. Today, there are billions of videos, podcasts, audio files, and texts on the internet covering almost anything you wish to study, making education more accessible.
Technology has sure come a long way! There are billions of inventions yet to be discovered by the upcoming generations, and many more after them.
So, if you’ve ever wondered how fast technology is growing, the statistics answer: lightning fast. And it is showing no signs of slowing down.
Can you imagine what life would be without technology?
The Internet of Things (IoT) describes the network of physical objects—“things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet. These devices range from ordinary household objects to sophisticated industrial tools. With more than 7 billion connected IoT devices today, experts are expecting this number to grow to 10 billion by 2020 and 22 billion by 2025.
Over the past few years, IoT has become one of the most important technologies of the 21st century. Now that we can connect everyday objects—kitchen appliances, cars, thermostats, baby monitors—to the internet via embedded devices, seamless communication is possible between people, processes, and things.
By means of low-cost computing, the cloud, big data, analytics, and mobile technologies, physical things can share and collect data with minimal human intervention. In this hyperconnected world, digital systems can record, monitor, and adjust each interaction between connected things. The physical world meets the digital world—and they cooperate.
What technologies have made IoT possible?
While the idea of IoT has been in existence for a long time, a collection of recent advances in a number of different technologies has made it practical.
Access to low-cost, low-power sensor technology. Affordable and reliable sensors are making IoT technology possible for more manufacturers.
Connectivity. A host of network protocols for the internet has made it easy to connect sensors to the cloud and to other “things” for efficient data transfer.
Cloud computing platforms. The increase in the availability of cloud platforms enables both businesses and consumers to access the infrastructure they need to scale up without actually having to manage it all.
Machine learning and analytics. With advances in machine learning and analytics, along with access to varied and vast amounts of data stored in the cloud, businesses can gather insights faster and more easily. The emergence of these allied technologies continues to push the boundaries of IoT, and the data produced by IoT in turn feeds these technologies.
Conversational artificial intelligence (AI). Advances in neural networks have brought natural-language processing (NLP) to IoT devices (such as digital personal assistants Alexa, Cortana, and Siri) and made them appealing, affordable, and viable for home use.
What is industrial IoT?
Industrial IoT (IIoT) refers to the application of IoT technology in industrial settings, especially with respect to instrumentation and control of sensors and devices that engage cloud technologies. Refer to this Titan use case PDF for a good example of IIoT. Recently, industries have used machine-to-machine communication (M2M) to achieve wireless automation and control. But with the emergence of cloud and allied technologies (such as analytics and machine learning), industries can achieve a new automation layer and with it create new revenue and business models. IIoT is sometimes called the fourth wave of the industrial revolution, or Industry 4.0. The following are some common uses for IIoT:
Connected assets and preventive and predictive maintenance
Smart power grids
Smart digital supply chains
What are IoT applications?
Business-ready, SaaS IoT Applications
IoT Intelligent Applications are prebuilt software-as-a-service (SaaS) applications that can analyze and present captured IoT sensor data to business users via dashboards.
IoT applications use machine learning algorithms to analyze massive amounts of connected sensor data in the cloud. Using real-time IoT dashboards and alerts, you gain visibility into key performance indicators, statistics for mean time between failures, and other information. Machine learning–based algorithms can identify equipment anomalies, send alerts to users, and even trigger automated fixes or proactive countermeasures.
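As a minimal stand-in for the ML-based anomaly detection described above, a rolling z-score threshold over sensor readings captures the core idea; the readings, window size, and threshold below are illustrative:

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate from the recent rolling mean by more
    than `threshold` standard deviations (a simple stand-in for the
    ML-based anomaly detection described above)."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            alerts.append((i, readings[i]))
    return alerts

# Stable vibration readings with one sudden spike at index 8
sensor_data = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49, 2.75, 0.50]
print(detect_anomalies(sensor_data))  # flags the spike at index 8
```

A production system would use a trained model and stream processing, but the alerting logic reduces to the same shape: compare each new reading against a learned baseline and notify when it falls outside tolerance.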
With cloud-based IoT applications, business users can quickly enhance existing processes for supply chains, customer service, human resources, and financial services. There’s no need to recreate entire business processes.
What are some ways IoT applications are deployed?
The ability of IoT to provide sensor information as well as enable device-to-device communication is driving a broad set of applications. The following are some of the most popular applications and what they do.
Create new efficiencies in manufacturing through machine monitoring and product-quality monitoring.
Machines can be continuously monitored and analyzed to make sure they are performing within required tolerances. Products can also be monitored in real time to identify and address quality defects.
Improve the tracking and “ring-fencing” of physical assets.
Tracking enables businesses to quickly determine asset location. Ring-fencing allows them to make sure that high-value assets are protected from theft and removal.
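Ring-fencing (geofencing) reduces to a distance check between a device's GPS fix and a fence boundary. A minimal sketch for a circular fence using the haversine formula; the coordinates and radius are hypothetical:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(asset_pos, fence_center, radius_m):
    """True if the asset's GPS fix falls within the circular fence."""
    return distance_m(*asset_pos, *fence_center) <= radius_m

warehouse = (51.5074, -0.1278)  # hypothetical fence centre
print(inside_geofence((51.5075, -0.1280), warehouse, 500))  # inside, ~20 m away
print(inside_geofence((51.6000, -0.1278), warehouse, 500))  # outside, ~10 km away
```

A real asset-tracking platform would evaluate this check on every position update and raise a theft/removal alert the first time it returns False.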
Use wearables to monitor human health analytics and environmental conditions.
IoT wearables enable people to better understand their own health and allow physicians to remotely monitor patients. This technology also enables companies to track the health and safety of their employees, which is especially useful for workers employed in hazardous conditions.
Drive efficiencies and new possibilities in existing processes.
One example of this is the use of IoT to increase efficiency and safety in connected logistics for fleet management. Companies can use IoT fleet monitoring to direct trucks, in real time, to improve efficiency.
Enable business process changes.
An example of this is the use of IoT devices for connected assets to monitor the health of remote machines and trigger service calls for preventive maintenance. The ability to remotely monitor machines is also enabling new product-as-a-service business models, where customers no longer need to buy a product but instead pay for its usage.
What industries can benefit from IoT?
Organizations best suited for IoT are those that would benefit from using sensor devices in their business processes.
Manufacturers can gain a competitive advantage by using production-line monitoring to enable proactive maintenance on equipment when sensors detect an impending failure. Sensors can actually measure when production output is compromised. With the help of sensor alerts, manufacturers can quickly check equipment for accuracy or remove it from production until it is repaired. This allows companies to reduce operating costs, get better uptime, and improve asset performance management.
The automotive industry stands to realize significant advantages from the use of IoT applications. In addition to the benefits of applying IoT to production lines, sensors can detect impending equipment failure in vehicles already on the road and can alert the driver with details and recommendations. Thanks to aggregated information gathered by IoT-based applications, automotive manufacturers and suppliers can learn more about how to keep cars running and car owners informed.
Transportation and Logistics
Transportation and logistical systems benefit from a variety of IoT applications. Fleets of cars, trucks, ships, and trains that carry inventory can be rerouted based on weather conditions, vehicle availability, or driver availability, thanks to IoT sensor data. The inventory itself could also be equipped with sensors for track-and-trace and temperature-control monitoring. The food and beverage, flower, and pharmaceutical industries often carry temperature-sensitive inventory that would benefit greatly from IoT monitoring applications that send alerts when temperatures rise or fall to a level that threatens the product.
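The temperature alerting described above boils down to a range check per shipment. A minimal sketch, assuming the common 2–8 °C pharmaceutical cold-chain band; the shipment IDs are made up for illustration:

```python
def check_cold_chain(readings, low=2.0, high=8.0):
    """Return alert messages for readings outside the safe band.
    The 2-8 C band matches common pharmaceutical cold-chain limits;
    shipment IDs are illustrative."""
    alerts = []
    for shipment, temp_c in readings:
        if not (low <= temp_c <= high):
            alerts.append(f"ALERT {shipment}: {temp_c} C out of range")
    return alerts

samples = [("PALLET-01", 4.6), ("PALLET-02", 9.3), ("PALLET-03", 1.1)]
for line in check_cold_chain(samples):
    print(line)
```

In practice the readings would stream in from the pallet sensors mentioned above, and an out-of-range alert would be pushed to the carrier before the product is spoiled.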
IoT applications allow retail companies to manage inventory, improve customer experience, optimize supply chain, and reduce operational costs. For example, smart shelves fitted with weight sensors can collect RFID-based information and send the data to the IoT platform to automatically monitor inventory and trigger alerts if items are running low. Beacons can push targeted offers and promotions to customers to provide an engaging experience.
The benefits of IoT in the public sector and other service-related environments are similarly wide-ranging. For example, government-owned utilities can use IoT-based applications to notify their users of mass outages and even of smaller interruptions of water, power, or sewer services. IoT applications can collect data concerning the scope of an outage and deploy resources to help utilities recover from outages with greater speed.
IoT asset monitoring provides multiple benefits to the healthcare industry. Doctors, nurses, and orderlies often need to know the exact location of patient-assistance assets such as wheelchairs. When a hospital’s wheelchairs are equipped with IoT sensors, they can be tracked from the IoT asset-monitoring application so that anyone looking for one can quickly find the nearest available wheelchair. Many hospital assets can be tracked this way to ensure proper usage as well as financial accounting for the physical assets in each department.
General Safety Across All Industries
In addition to tracking physical assets, IoT can be used to improve worker safety. Employees in hazardous environments such as mines, oil and gas fields, and chemical and power plants, for example, need to know about the occurrence of a hazardous event that might affect them. When they are connected to IoT sensor–based applications, they can be notified of accidents or rescued from them as swiftly as possible. IoT applications are also used for wearables that can monitor human health and environmental conditions. Not only do these types of applications help people better understand their own health, they also permit physicians to monitor patients remotely.
How is IoT changing the world? Take a look at connected cars.
IoT is reinventing the automobile by enabling connected cars. With IoT, car owners can operate their cars remotely—by, for example, preheating the car before the driver gets in it or by remotely summoning a car by phone. Given IoT’s ability to enable device-to-device communication, cars will even be able to book their own service appointments when warranted.
The connected car allows car manufacturers or dealers to turn the car ownership model on its head. Previously, manufacturers have had an arms-length relationship with individual buyers (or none at all). Essentially, the manufacturer’s relationship with the car ended once it was sent to the dealer. With connected cars, automobile makers or dealers can have a continuous relationship with their customers. Instead of selling cars, they can charge drivers usage fees, offering a “transportation-as-a-service” using autonomous cars. IoT allows manufacturers to upgrade their cars continuously with new software, a sea-change difference from the traditional model of car ownership in which vehicles immediately depreciate in performance and value.
The convergence of artificial intelligence, blockchain, cloud computing, edge computing, Internet of Things (IoT), 5G, computer vision and augmented/virtual reality is taking society on a journey through the next wave of the digital revolution and toward the metaverse.
As one of the key enablers of the metaverse, IoT has reshaped our lives in significant ways with a myriad of applications, including smart homes, smart manufacturing, smart healthcare and intelligent transportation systems. Billions of connected devices have generated massive amounts of data that tech giants analyzed to extract valuable insight for their businesses.
However, the IoT industry presently possesses several limitations that restrict the sustainable growth of IoT ecosystems. Can blockchain and cryptocurrency help tackle industry-wide challenges and take IoT to the next level?
Internet of Things: The Status Quo
Today, a typical IoT application is still primarily centralized. An IoT company distributes smart devices to its customers and builds the entire solution that often includes various components.
These include identity management, device management, connectivity gateway, data storage, digital twin, data visualization and others, all on a preferred cloud platform. Centralized IoT system architecture was developed to deliver incredible value to customers, but it comes with five key disadvantages:
• Single point of failure: An IoT solution deployed as a centralized solution is subject to a single point of failure. Although cloud service providers have made efforts to improve the scalability, reliability and availability of their platforms, cloud platforms still experience service outages from time to time, leaving customers with smart devices in the lurch.
• Ownership of devices and data: Users who purchase IoT devices do not truly own their devices or data that’s collected. The lifecycle of smart devices is often fully managed by IoT companies, and it is quite difficult, if not impossible, for users to repurpose their devices for other applications. Moreover, IoT companies have extensively used data collected by smart devices, creating new value in businesses without compensating their customers.
• Application and data silos: Most IoT solutions deployed on a centralized platform are self-contained, thereby forming application and data silos. Those silos hinder the value exchange between different IoT systems and result in the loss of new business opportunities.
• Misalignment of values: IoT ecosystems consist of multiple stakeholders, such as device manufacturers/OEMs, network operators, platform providers, service providers, end-users, etc. The centralized IoT architecture enables platform and service providers to maximize their shares of the value chain revenue, whereas the profit margins for device manufacturers/OEMs are quite slim. In addition, end-users are excluded from the centralized IoT value chain.
• Barriers to innovation: The application and data silos, coupled with rigid business models, create barriers to continuous innovation in IoT. It slows down technology adoption and ecosystem growth.
Internet of Things: A New Dawn
The introduction of blockchain and cryptocurrency has shed light on a new community-driven machine economy called MachineFi. The innovative combination of blockchain, cryptocurrency and IoT provides effective solutions to address challenges that the IoT industry is facing.
• High availability and security: The decentralized nature of blockchain implies that applications running on top of it can achieve high availability and security. As a result, IoT companies could leverage blockchain to deploy the critical components of their solutions, thereby reducing service downtime and enhancing system trustworthiness.
• User-owned device and data: By applying the emerging concepts such as decentralized identifiers (DIDs) and verifiable credentials (VCs) to build a self-sovereign identity metasystem for people, organizations and IoT devices, users can gain control over data collected by their smart devices and decide how it’s shared.
• Interoperable applications and data: Using blockchain as the underlying fabric can connect different IoT applications, enabling them to exchange digital assets in a transparent and trustworthy manner. In particular, large-scale decentralized and autonomous IoT applications can be built upon individual applications by leveraging interoperable data.
• Fair distribution of values: Cryptocurrencies and associated token economy models provide powerful tools for incentivizing all the stakeholders in IoT ecosystems. Aligning the stakeholders’ benefits in a fair and consistent manner is the main force for transforming the IoT industry and forming the flywheel effect.
• Endless innovation: By combining IoT with the value exchange layer powered by DeFi, NFTs and DAOs, the IoT industry is able to create new business models and build a wide range of community-driven, machine-centric applications. Such digital transformation will bring endless innovation opportunities to IoT ecosystems.
MachineFi represents a paradigm shift in the way IoT systems are designed and monetized. It considers all of the stakeholders in an ecosystem and incentivizes them to move the value-creation flywheel continuously. The exciting future of an IoT for all of us is no longer a dream. It's a reality.
In 2021, the global IoT market reached a total value of $17.5 billion.
Consumers are rapidly adopting smart home technology, which accounts for 97% of global sales. Smart home technologies show the fastest growth compared to other categories, fueled by home upgrades during the COVID-19 pandemic.
The Internet of Things industry is diverse in terms of both application and brand. The IoT market is shaped by pure players (like PTC) as well as huge consolidated organizations with diverse goods and services.
The introduction of the 5G communication standard is a game changer for speedy connectivity between IoT devices. Using edge computing instead of cloud computing speeds up processes by collecting and analyzing data at the IoT device level.
Automation utilizing IoT also benefits industrial use cases, resulting in a new trend: the Industrial Internet of Things (IIoT). But as IoT use grows, so do cybersecurity dangers, as these devices become targets for hackers.
Blockchain has proven to have a significant impact on the Internet of Things by increasing safety and enabling the integration of more devices. The improvements in IoT device security speed up the adoption of this breakthrough invention and open up new opportunities for businesses.
As of today, few IoT systems utilize the blockchain to transfer data. Blockchain technology allows for immutable and decentralized data transfer, and both IoT and blockchain require deliberate management of intentional and unintentional risks.
For these reasons, blockchain technology can address several IoT cybersecurity needs, including integrity, secure communication, and resilience. It might also bring additional security qualities, such as availability and accessibility, to a secure micropayment system.
The ideal blockchain implementation in the IoT space must have no or minimal transaction costs, significant growth potential, and a scalable identity management procedure.
However, traditional blockchain does not address all IoT security concerns: personal data confidentiality and protection need additional encryption.
That’s where IoTeX stands out.
IoTeX was founded in 2017 by Raullen Chai, Qevan Guo, and Jing Sun, and was deployed in February 2018.
IoTeX is a full-stack platform that enables trustworthy data from trusted devices to be used in trusted DApps.
It employs permissioned or permissionless blockchains, enhancing privacy with quick consensus and immediate finality.
IoTeX believes no one blockchain solution can meet all IoT needs. For this reason, they established specific platforms that will communicate with defined IoT devices, following the idea of separation of tasks.
Indeed, the specified level of IoT structures can only be managed by a certain level of blockchain complexity.
The IoTeX platform is composed of many technology layers:
Roll-DPoS consensus with more than 60 decentralized delegates
Secure Hardware: tamper-proof devices using Trusted Execution Environment (TEE) that work flawlessly with IoTeX
Real-World Data Oracles: turn real-world events into verified data for IoTeX DApps
Decentralized identity framework that allows users/devices to control their data and credentials
IoTeX Rootchain and Subchains — Fast Consensus with Instant Finality
IoTeX has a public permissionless root chain as well as many subchains.
Subchains may be permissioned or permissionless blockchains that allow smart contracts.
The root chain is a public blockchain that focuses on scalability, resilience, privacy-preserving functionalities, and subchain orchestration. It has been deployed to transmit value and data across subchains, supervise the different subchains, and handle settlement and anchor payments for them.
To ease transaction ordering, the IoTeX root chain employs the UTXO concept.
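As a rough illustration of the UTXO concept (a generic sketch, not IoTeX's actual implementation), each transaction consumes existing unspent outputs and creates new ones; the owners, amounts, and toy transaction IDs below are made up:

```python
# Minimal UTXO bookkeeping sketch (illustrative only).
# Each UTXO is keyed by (tx_id, output_index) -> (owner, amount).
utxo_set = {("tx0", 0): ("alice", 50)}

def spend(utxos, inputs, outputs):
    """Consume the listed inputs and create new outputs as fresh UTXOs."""
    total_in = sum(utxos[i][1] for i in inputs)
    total_out = sum(amount for _, amount in outputs)
    assert total_in >= total_out, "inputs must cover outputs"
    for i in inputs:
        del utxos[i]  # spent outputs disappear from the set
    tx_id = f"tx{len(utxos) + len(outputs)}"  # toy ID scheme, not a real hash
    for idx, (owner, amount) in enumerate(outputs):
        utxos[(tx_id, idx)] = (owner, amount)
    return tx_id

# Alice pays Bob 30 and keeps 20 as change
spend(utxo_set, [("tx0", 0)], [("bob", 30), ("alice", 20)])
print(sorted(utxo_set.values()))  # [('alice', 20), ('bob', 30)]
```

Because every transaction fully consumes its inputs, ordering is simple to audit: a UTXO either exists unspent or it does not, which is the property that eases transaction ordering.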
A subchain, on the other hand, is a blockchain, either private or public, that uses the root chain to communicate with other subchains.
A subchain's key characteristics are flexibility and extensibility as they are needed to meet the diverse IoT applications. To function, a subchain is typically managed by operators with a strong stake in the root chain.
Additionally, the system lets users choose one or more delegated operators to act for them, with or without a bond. To seal new blocks, the delegator acts like a light client on the root chain, and like a full node on the subchain.
The IoTeX root chain consensus delivers immutable blocks in real time. It employs the so-called Roll-DPoS (randomized delegated proof of stake): token holders vote for their delegates, who are then ranked according to the number of votes they get.
The delegates who received the most votes are known as the “consensus delegates” for the present epoch (1 hour). A randomization method then selects a sub-committee to preserve the agreement and generate new blocks for each new epoch.
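A minimal sketch of this two-stage selection (vote ranking, then a seeded random draw per epoch) might look like the following. The delegate names, vote counts, and committee sizes are made up for illustration, and the real network uses 60+ delegates and an on-chain randomness source rather than Python's `random`:

```python
import random

def roll_dpos_committee(votes, num_delegates=4, committee_size=2, epoch=1):
    """Sketch of Roll-DPoS delegate selection (sizes are illustrative).
    Top-voted candidates become consensus delegates, then a seeded
    random draw picks the sub-committee for this epoch."""
    ranked = sorted(votes, key=votes.get, reverse=True)
    consensus_delegates = ranked[:num_delegates]
    rng = random.Random(epoch)  # stand-in for the on-chain randomness beacon
    return rng.sample(consensus_delegates, committee_size)

votes = {"d1": 900, "d2": 750, "d3": 600, "d4": 500, "d5": 100}
committee = roll_dpos_committee(votes, epoch=42)
print(committee)  # a 2-member subset of the top-4 delegates
```

Seeding the draw with the epoch makes the selection deterministic for all honest nodes within an epoch while still rotating the block producers over time.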
Block finality is critical for IoTeX cross-blockchain communications. These interactions are based on simplified payment verification (SPV), a mechanism that allows a lightweight node to authenticate a transaction using a Merkle tree and block headers without downloading the complete blockchain. IoTeX employs two-way pegging (TWP) to allow token transfers to and from subchains.
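The SPV idea, recomputing a Merkle root from one transaction and its sibling path instead of downloading the whole chain, can be sketched in a few lines. This is a generic Merkle-proof check, not IoTeX's actual implementation:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Build a Merkle root by pairwise hashing (odd last node duplicated)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify_proof(leaf, proof, root):
    """SPV-style check: recompute the root from a leaf and its sibling
    path (each step records whether the sibling sits on the left)."""
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d"]
root = merkle_root(txs)
# Proof for tx-b: hash of tx-a (left sibling), then hash of the (tx-c, tx-d) pair
proof = [(h(b"tx-a"), True), (h(h(b"tx-c") + h(b"tx-d")), False)]
print(verify_proof(b"tx-b", proof, root))  # True
```

A light client thus needs only the block headers (which contain the root) and a logarithmic-size proof, which is what makes cross-chain verification cheap enough for constrained IoT devices.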
A core idea of the project is to have and provide final users with trusted devices for the data collection.
For hardware to be considered secure and tamper-proof, it needs to embed a Trusted Execution Environment (TEE), an extremely secure and segregated enclave that operates in parallel with the device's main system.
A TEE protects the confidentiality and integrity of all data and processes inside it.
IoTeX’s goal is to make the first decentralized machines that can participate autonomously in the Internet of Trusted Things. In this regard, the company made the first hardware device that can’t be manipulated: the Pebble Tracker.
The Pebble Tracker has a TEE and many sensors (GPS, climate, motion, and light) to gather information from the real world and turn it into verifiable, blockchain-ready data. In addition to minting digital assets, smart contracts can be used to do things like train machine learning models and build crowdsourced climate indices from verifiable and trusted data.
Decentralized Identity (DID) is the “root of self-sovereignty” for the IoTeX platform. Unlike other blockchain networks, IoTeX has created a DID system for both individuals and machines. People and devices may interact directly using IoTeX since their IDs are interoperable and standardized. IoTeX DID also enables people and devices to own/control their data and identity over the IoTeX network.
The Industrial Internet Consortium (IIC) is currently standardizing IoTeX’s DID technology and Identity & Access Management (IAM) architecture. It can link various application layers and enable user-centric data exchange across global IoT ecosystems with billions of IoT devices and millions of users.
Data oracles are required for smart contracts to access off-chain data. For the blockchain sector, IoTeX is constructing the world’s first data oracles that concentrate on verified real-world data from trustworthy devices, making IoTeX the first mover in this direction.
Real-world data on IoTeX will enable thousands of use cases and new on-chain assets supported by real-world data. As an approved data hub, IoTeX may now “serve” data to other blockchains like Ethereum and Polkadot.
Raullen Chai, Qevan Guo, Xinxin Fan, and Jing Sun are the creators of IoTeX.
In addition to co-founding IoTeX, Raullen Chai is a consultant at BootUP Ventures and a member of the Industrial Internet Consortium’s Industrial Distributed Ledger Task Group. He formerly served as Uber’s head of cryptocurrency research and development, as well as technical security.
Qevan Guo is also one of Hyperconnect Lab’s co-founders. He worked for Facebook as a researcher and technical director.
Xinxin Fan was a senior research engineer at the Bosch Research and Technology Center in North America prior to co-founding IoTeX. He also worked at the University of Waterloo as a research associate and project manager.
Jing Sun serves as a managing partner at Sparkland Capital. She is a limited partner at Polychain Capital and an angel investor in Rippling.
The whole IoTeX team is made up of about 30 people, including scientists, researchers, and numerous engineers from giants such as Google, Facebook, Uber, and Bosch.
The $IOTX token enables the IoTeX blockchain. IOTX provides numerous utilities to facilitate trustworthy interactions amongst stakeholders, including users, Delegates (miners), application makers, and service providers.
The IOTX token offers financial and reputational incentives to promote decentralized IoTeX Network governance/maintenance. Participants may spend, stake, or burn IOTX to access network resources. Increased demand and value of IOTX will encourage network members to maintain and extend the network.
Delegates stake IOTX to be eligible to participate in consensus, while service providers stake/spend IOTX to provide services to builders.
IOTX has a 10 billion maximum supply and is deflationary — IOTX is burnt for every new device and user registered to the IoTeX Network, rewarding long-term holders.
Following the onboarding of 1 million “Powered by IoTeX” devices, the “Burn-to-Certify” tokenomics will be enabled. From that point on, builders will burn IOTX to access specific services/capabilities for each new device, and the overall supply of IOTX will drop with each additional “Powered by IoTeX” device.
Notably, these are the tokenomics that power the IoTeX blockchain; however, apps “Powered by IoTeX” may create their own tokens and tokenomics based on their own incentives/rules.
Maximum Supply: 10 Billion IOTX
Total Supply: 8.8 Billion IOTX (after Burn-Drop)
900 Million IOTX (9% of max supply) will be gradually burned as 1 million devices are registered and confirmed on IoTeX
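The burn-drop figures above imply roughly a flat per-device burn. A toy sketch, assuming 900 IOTX per device so that 1 million devices burn the stated 900 million IOTX (the real burn schedule may differ, and other supply adjustments are ignored here):

```python
def burn_drop(total_supply, devices, burn_per_device):
    """Sketch of the Burn-to-Certify idea: a fixed amount of IOTX is
    burned for each newly registered device, shrinking total supply.
    The per-device burn amount is hypothetical, not the real rate."""
    burned = devices * burn_per_device
    return total_supply - burned, burned

# 10B max supply; 900 IOTX per device is an illustrative figure only
supply_after, burned = burn_drop(10_000_000_000, 1_000_000, 900)
print(f"burned {burned:,}, remaining supply {supply_after:,}")
```

The point of the mechanism is simply that supply is a decreasing function of registered devices, which is what rewards long-term holders as the network grows.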
MachineFi is a concept used to describe the combination of blockchain and Internet of Things (IoT) technology.
This concept seeks to connect the physical world with the metaverse.
MachineFi also defines a network of smart devices that communicate with one another on the blockchain via the internet. On many fronts, blockchain has developed a robust framework for enabling decentralization.
IoTeX 2.0 intends to decentralize the MachineFi sector, allowing smart device users to engage in a rising trillion-dollar economy free of the constraints imposed by centralized data providers.
The “Proof-of-Anything” idea will be launched in the MachineFi upgrade. This will let IoT devices provide on-chain proofs of real-world events such as health measurements and GPS positions.
Blockchain technology can meet several cybersecurity requirements for IoT devices because it is distributed and can’t be changed.
However, a single blockchain implementation without other tools to handle complexity, such as smart contracts, edge computing, and cloud computing, can't meet all of the security requirements of IoT platforms.
Identity and Access Management is an important part of building a strong defense against intentional risks, and IoTeX decided to tackle it from day one.
This is not, in any case, financial advice. The goal of my research will always be to dive deep into projects and study them from different angles. I do include personal opinions based on my experience with similar projects that I have recently studied.
I am and will always be open to discussion.
Please always do your research before investing in anything.
Information security attributes, or qualities, are Confidentiality, Integrity and Availability (CIA). Information systems are composed of three main portions (hardware, software and communications), and the purpose is to help identify and apply information security industry standards, as mechanisms of protection and prevention, at three levels or layers: physical, personal and organizational. Essentially, procedures or policies are implemented to tell administrators, users and operators how to use products to ensure information security within organizations.
Various definitions of information security are suggested below, summarized from different sources:
“Preservation of confidentiality, integrity and availability of information. Note: In addition, other properties, such as authenticity, accountability, non-repudiation and reliability can also be involved.” (ISO/IEC 27000:2009)
“The protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability.” (CNSS, 2010)
“Ensures that only authorized users (confidentiality) have access to accurate and complete information (integrity) when required (availability).” (ISACA, 2008)
“Information Security is the process of protecting the intellectual property of an organisation.” (Pipkin, 2000)
“…information security is a risk management discipline, whose job is to manage the cost of information risk to the business.” (McDermott and Geer, 2001)
“A well-informed sense of assurance that information risks and controls are in balance.” (Anderson, J., 2003)
“Information security is the protection of information and minimizes the risk of exposing information to unauthorized parties.” (Venter and Eloff, 2003)
“Information Security is a multidisciplinary area of study and professional activity which is concerned with the development and implementation of security mechanisms of all available types (technical, organizational, human-oriented and legal) in order to keep information in all its locations (within and outside the organization’s perimeter) and, consequently, information systems, where information is created, processed, stored, transmitted and destroyed, free from threats. Threats to information and information systems may be categorized and a corresponding security goal may be defined for each category of threats. A set of security goals, identified as a result of a threat analysis, should be revised periodically to ensure its adequacy and conformance with the evolving environment. The currently relevant set of security goals may include: confidentiality, integrity, availability, privacy, authenticity & trustworthiness, non-repudiation, accountability and auditability.” (Cherdantseva and Hilton, 2013)
Information and information resource security using telecommunication systems or devices means protecting information, information systems or books from unauthorized access, damage, theft, or destruction (Kurose and Ross, 2010).
At the core of information security is information assurance, the act of maintaining the confidentiality, integrity, and availability (CIA) of information, ensuring that information is not compromised in any way when critical issues arise. These issues include but are not limited to natural disasters, computer/server malfunction, and physical theft. While paper-based business operations are still prevalent, requiring their own set of information security practices, enterprise digital initiatives are increasingly being emphasized, with information assurance now typically being dealt with by information technology (IT) security specialists. These specialists apply information security to technology (most often some form of computer system). It is worthwhile to note that a computer does not necessarily mean a home desktop. A computer is any device with a processor and some memory. Such devices can range from non-networked standalone devices as simple as calculators, to networked mobile computing devices such as smartphones and tablet computers. IT security specialists are almost always found in any major enterprise/establishment due to the nature and value of the data within larger businesses. They are responsible for keeping all of the technology within the company secure from malicious cyber attacks that often attempt to acquire critical private information or gain control of the internal systems.
The field of information security has grown and evolved significantly in recent years. It offers many areas for specialization, including securing networks and allied infrastructure, securing applications and databases, security testing, information systems auditing, business continuity planning, electronic record discovery, and digital forensics. Information security professionals are very stable in their employment. As of 2013 more than 80 percent of professionals had no change in employer or employment over a period of a year, and the number of professionals is projected to continuously grow more than 11 percent annually from 2014 to 2019.
Information security threats come in many different forms. Some of the most common threats today are software attacks, theft of intellectual property, theft of identity, theft of equipment or information, sabotage, and information extortion. Most people have experienced software attacks of some sort. Viruses, worms, phishing attacks, and Trojan horses are a few common examples of software attacks. The theft of intellectual property has also been an extensive issue for many businesses in the information technology (IT) field. Identity theft is the attempt to act as someone else, usually to obtain that person’s personal information or to take advantage of their access to vital information through social engineering. Theft of equipment or information is becoming more prevalent today because most devices are mobile, prone to theft, and far more desirable as their data capacity increases. Sabotage usually consists of the destruction of an organization’s website in an attempt to cause its customers to lose confidence. Information extortion consists of theft of a company’s property or information as an attempt to receive a payment in exchange for returning the information or property to its owner, as with ransomware. There are many ways to protect against these attacks, but one of the most effective precautions is to conduct periodic user awareness training. The number one threat to any organization is its own users or internal employees, also known as insider threats.
Governments, military, corporations, financial institutions, hospitals, non-profit organisations, and private businesses amass a great deal of confidential information about their employees, customers, products, research, and financial status. Should confidential information about a business’ customers or finances or new product line fall into the hands of a competitor or a black hat hacker, a business and its customers could suffer widespread, irreparable financial loss, as well as damage to the company’s reputation. From a business perspective, information security must be balanced against cost; the Gordon-Loeb Model provides a mathematical economic approach for addressing this concern.
For the individual, information security has a significant effect on privacy, which is viewed very differently in various cultures.
Responses to threats
Possible responses to a security threat or risk are:
reduce/mitigate – implement safeguards and countermeasures to eliminate vulnerabilities or block threats
assign/transfer – place the cost of the threat onto another entity or organization such as purchasing insurance or outsourcing
accept – evaluate if the cost of the countermeasure outweighs the possible cost of loss due to the threat
Since the early days of communication, diplomats and military commanders understood that it was necessary to provide some mechanism to protect the confidentiality of correspondence and to have some means of detecting tampering. Julius Caesar is credited with the invention of the Caesar cipher c. 50 B.C., which was created in order to prevent his secret messages from being read should a message fall into the wrong hands. However, for the most part protection was achieved through the application of procedural handling controls. Sensitive information was marked up to indicate that it should be protected and transported by trusted persons, guarded and stored in a secure environment or strong box. As postal services expanded, governments created official organizations to intercept, decipher, read, and reseal letters (e.g., the U.K.’s Secret Office, founded in 1653).
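The Caesar cipher mentioned above is simple enough to sketch in a few lines. The short Python function below is a historical illustration only, with no real security value today: each letter is shifted a fixed number of places through the alphabet, and decryption is simply the inverse shift.

```python
def caesar(text: str, shift: int) -> str:
    """Shift alphabetic characters by `shift` places, wrapping at Z/z."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

secret = caesar("ATTACK AT DAWN", 3)   # "DWWDFN DW GDZQ"
plain = caesar(secret, -3)             # decryption uses the inverse shift
```

The weakness, of course, is that there are only 25 possible shifts, so the cipher falls to trivial exhaustive search; it matters here only as the historical starting point of the field.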
In the mid-nineteenth century more complex classification systems were developed to allow governments to manage their information according to the degree of sensitivity. For example, the British Government codified this, to some extent, with the publication of the Official Secrets Act in 1889. Section 1 of the law concerned espionage and unlawful disclosures of information, while Section 2 dealt with breaches of official trust. A public interest defense was soon added to defend disclosures in the interest of the state. A similar law was passed in India in 1889, The Indian Official Secrets Act, which was associated with the British colonial era and used to crack down on newspapers that opposed the Raj’s policies. A newer version was passed in 1923 that extended to all matters of confidential or secret information for governance. By the time of the First World War, multi-tier classification systems were used to communicate information to and from various fronts, which encouraged greater use of code making and breaking sections in diplomatic and military headquarters. Encoding became more sophisticated between the wars as machines were employed to scramble and unscramble information.
The establishment of computer security inaugurated the history of information security. The need for such appeared during World War II. The volume of information shared by the Allied countries during the Second World War necessitated formal alignment of classification systems and procedural controls. An arcane range of markings evolved to indicate who could handle documents (usually officers rather than enlisted troops) and where they should be stored as increasingly complex safes and storage facilities were developed. The Enigma Machine, which was employed by the Germans to encrypt the data of warfare and was successfully decrypted by Alan Turing, can be regarded as a striking example of creating and using secured information. Procedures evolved to ensure documents were destroyed properly, and it was the failure to follow these procedures which led to some of the greatest intelligence coups of the war (e.g., the capture of U-570).
In 1973, important elements of ARPANET security were found by internet pioneer Robert Metcalfe to have many flaws such as the: “vulnerability of password structure and formats; lack of safety procedures for dial-up connections; and nonexistent user identification and authorizations”, aside from the lack of controls and safeguards to keep data safe from unauthorized access. Hackers had effortless access to ARPANET, as phone numbers were known by the public. Due to these problems, coupled with the constant violation of computer security, as well as the exponential increase in the number of hosts and users of the system, “network security” was often alluded to as “network insecurity”.
The end of the twentieth century and the early years of the twenty-first century saw rapid advancements in telecommunications, computing hardware and software, and data encryption. The availability of smaller, more powerful, and less expensive computing equipment put electronic data processing within the reach of small businesses and home users. The establishment of the Transmission Control Protocol/Internet Protocol (TCP/IP) suite in the early 1980s enabled different types of computers to communicate. These computers quickly became interconnected through the internet.
The rapid growth and widespread use of electronic data processing and electronic business conducted through the internet, along with numerous occurrences of international terrorism, fueled the need for better methods of protecting the computers and the information they store, process, and transmit. The academic disciplines of computer security and information assurance emerged along with numerous professional organizations, all sharing the common goals of ensuring the security and reliability of information systems.
The CIA triad of confidentiality, integrity, and availability is at the heart of information security. (The members of the classic InfoSec triad—confidentiality, integrity, and availability—are interchangeably referred to in the literature as security attributes, properties, security goals, fundamental aspects, information criteria, critical information characteristics and basic building blocks.) However, debate continues about whether or not this CIA triad is sufficient to address rapidly changing technology and business requirements, with recommendations to consider expanding on the intersections between availability and confidentiality, as well as the relationship between security and privacy. Other principles such as “accountability” have sometimes been proposed; it has been pointed out that issues such as non-repudiation do not fit well within the three core concepts.
The triad seems to have first been mentioned in a NIST publication in 1977.
First published in 1992 and revised in 2002, the OECD‘s Guidelines for the Security of Information Systems and Networks proposed nine generally accepted principles: awareness, responsibility, response, ethics, democracy, risk assessment, security design and implementation, security management, and reassessment. Building upon those, in 2004 the NIST‘s Engineering Principles for Information Technology Security proposed 33 principles. From each of these, further guidelines and practices have been derived.
In information security, confidentiality “is the property, that information is not made available or disclosed to unauthorized individuals, entities, or processes.” While similar to “privacy,” the two words are not interchangeable. Rather, confidentiality is a component of privacy, implemented to protect data from unauthorized viewers. Examples of confidentiality of electronic data being compromised include laptop theft, password theft, or sensitive emails being sent to the incorrect individuals.
In IT security, data integrity means maintaining and assuring the accuracy and completeness of data over its entire lifecycle. This means that data cannot be modified in an unauthorized or undetected manner. This is not the same thing as referential integrity in databases, although it can be viewed as a special case of consistency as understood in the classic ACID model of transaction processing. Information security systems typically incorporate controls to ensure their own integrity, in particular protecting the kernel or core functions against both deliberate and accidental threats. Multi-purpose and multi-user computer systems aim to compartmentalize the data and processing such that no user or process can adversely impact another: the controls may not succeed however, as we see in incidents such as malware infections, hacks, data theft, fraud, and privacy breaches.
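One common way systems detect unauthorized or accidental modification is to store a cryptographic digest of the data when it is written and recompute it later. The sketch below uses Python’s standard `hashlib`; the function name and sample data are illustrative, not from any particular product.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the data, used as an integrity check."""
    return hashlib.sha256(data).hexdigest()

original = b"quarterly-report-v1"
stored_digest = fingerprint(original)   # recorded when the data is stored

# Later: any change to the data, however small, changes the digest,
# so modification cannot go undetected.
tampered = b"quarterly-report-v2"
assert fingerprint(original) == stored_digest   # intact
assert fingerprint(tampered) != stored_digest   # change detected
```

A digest alone detects modification but does not identify who made it; pairing the digest with a key (an HMAC) or a digital signature also authenticates the source.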
More broadly, integrity is an information security principle that involves human/social, process, and commercial integrity, as well as data integrity. As such it touches on aspects such as credibility, consistency, truthfulness, completeness, accuracy, timeliness, and assurance.
For any information system to serve its purpose, the information must be available when it is needed. This means the computing systems used to store and process the information, the security controls used to protect it, and the communication channels used to access it must be functioning correctly. High availability systems aim to remain available at all times, preventing service disruptions due to power outages, hardware failures, and system upgrades. Ensuring availability also involves preventing denial-of-service attacks, such as a flood of incoming messages to the target system, essentially forcing it to shut down.
In the realm of information security, availability can often be viewed as one of the most important parts of a successful information security program. Ultimately end-users need to be able to perform job functions; by ensuring availability an organization is able to perform to the standards that an organization’s stakeholders expect. This can involve topics such as proxy configurations, outside web access, the ability to access shared drives and the ability to send emails. Executives oftentimes do not understand the technical side of information security and look at availability as an easy fix, but this often requires collaboration from many different organizational teams, such as network operations, development operations, incident response, and policy/change management. A successful information security team involves many different key roles to mesh and align for the CIA triad to be provided effectively.
In law, non-repudiation implies one’s intention to fulfill their obligations to a contract. It also implies that one party of a transaction cannot deny having received a transaction, nor can the other party deny having sent a transaction.
It is important to note that while technology such as cryptographic systems can assist in non-repudiation efforts, the concept is at its core a legal concept transcending the realm of technology. It is not, for instance, sufficient to show that the message matches a digital signature signed with the sender’s private key, and thus only the sender could have sent the message, and nobody else could have altered it in transit (data integrity). The alleged sender could in return demonstrate that the digital signature algorithm is vulnerable or flawed, or allege or prove that his signing key has been compromised. The fault for these violations may or may not lie with the sender, and such assertions may or may not relieve the sender of liability, but the assertion would invalidate the claim that the signature necessarily proves authenticity and integrity. As such, the sender may repudiate the message (because authenticity and integrity are pre-requisites for non-repudiation).
Broadly speaking, risk is the likelihood that something bad will happen that causes harm to an informational asset (or the loss of the asset). A vulnerability is a weakness that could be used to endanger or cause harm to an informational asset. A threat is anything (man-made or act of nature) that has the potential to cause harm. The likelihood that a threat will use a vulnerability to cause harm creates a risk. When a threat does use a vulnerability to inflict harm, it has an impact. In the context of information security, the impact is a loss of availability, integrity, and confidentiality, and possibly other losses (lost income, loss of life, loss of real property).
There are two things in this definition that may need some clarification. First, the process of risk management is an ongoing, iterative process. It must be repeated indefinitely. The business environment is constantly changing and new threats and vulnerabilities emerge every day. Second, the choice of countermeasures (controls) used to manage risks must strike a balance between productivity, cost, effectiveness of the countermeasure, and the value of the informational asset being protected. Furthermore, these processes have limitations as security breaches are generally rare and emerge in a specific context which may not be easily duplicated. Thus, any process and countermeasure should itself be evaluated for vulnerabilities. It is not possible to identify all risks, nor is it possible to eliminate all risk. The remaining risk is called “residual risk.”
A risk assessment is carried out by a team of people who have knowledge of specific areas of the business. Membership of the team may vary over time as different parts of the business are assessed. The assessment may use a subjective qualitative analysis based on informed opinion, or where reliable dollar figures and historical information is available, the analysis may use quantitative analysis.
Calculate the impact that each threat would have on each asset. Use qualitative analysis or quantitative analysis.
Identify, select and implement appropriate controls. Provide a proportional response. Consider productivity, cost effectiveness, and value of the asset.
Evaluate the effectiveness of the control measures. Ensure the controls provide the required cost effective protection without discernible loss of productivity.
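As a worked illustration of the quantitative analysis mentioned above (the text does not prescribe a specific method, so this is one common textbook approach), annualized loss expectancy multiplies the loss from a single incident by its expected yearly frequency.

```python
def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate: float) -> float:
    """ALE = SLE * ARO, where SLE = asset value * exposure factor."""
    sle = asset_value * exposure_factor   # expected loss from one incident
    return sle * annual_rate              # expected loss per year

# Example (illustrative figures): a $200,000 server, 25% of its value
# lost per incident, and 0.5 incidents expected per year.
ale = annualized_loss_expectancy(200_000, 0.25, 0.5)   # 25000.0
```

A control costing less per year than the reduction in ALE it achieves is, by this measure, a proportional response; qualitative analysis is used instead when reliable dollar figures are unavailable.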
For any given risk, management can choose to accept the risk based upon the relative low value of the asset, the relative low frequency of occurrence, and the relative low impact on the business. Or, leadership may choose to mitigate the risk by selecting and implementing appropriate control measures to reduce the risk. In some cases, the risk can be transferred to another business by buying insurance or outsourcing to another business. The reality of some risks may be disputed. In such cases leadership may choose to deny the risk.
Selecting and implementing proper security controls will initially help an organization bring risk down to acceptable levels. Control selection should follow, and be based on, the risk assessment. Controls can vary in nature, but fundamentally they are ways of protecting the confidentiality, integrity, or availability of information. ISO/IEC 27001 has defined controls in different areas, and organizations can implement additional controls according to their own requirements. ISO/IEC 27002 offers a guideline for organizational information security standards.
Administrative controls (also called procedural controls) consist of approved written policies, procedures, standards, and guidelines. Administrative controls form the framework for running the business and managing people. They inform people on how the business is to be run and how day-to-day operations are to be conducted. Laws and regulations created by government bodies are also a type of administrative control because they inform the business. Some industry sectors have policies, procedures, standards, and guidelines that must be followed – the Payment Card Industry Data Security Standard (PCI DSS) required by Visa and MasterCard is such an example. Other examples of administrative controls include the corporate security policy, password policy, hiring policies, and disciplinary policies.
Administrative controls form the basis for the selection and implementation of logical and physical controls. Logical and physical controls are manifestations of administrative controls, which are of paramount importance.
An important logical control that is frequently overlooked is the principle of least privilege, which requires that an individual, program or system process not be granted any more access privileges than are necessary to perform the task. A blatant example of the failure to adhere to the principle of least privilege is logging into Windows as user Administrator to read email and surf the web. Violations of this principle can also occur when an individual collects additional access privileges over time. This happens when employees’ job duties change, employees are promoted to a new position, or employees are transferred to another department. The access privileges required by their new duties are frequently added onto their already existing access privileges, which may no longer be necessary or appropriate.
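The principle of least privilege can be sketched as a simple permission check: each role carries only the permissions its tasks require, and every action is tested against that set. The role and permission names below are hypothetical.

```python
# Hypothetical role-to-permission mapping: each role holds only what
# its job duties require, per the principle of least privilege.
ROLE_PERMISSIONS = {
    "email_user": {"read_mail", "send_mail"},
    "web_admin":  {"read_mail", "send_mail", "restart_web_server"},
}

def is_allowed(role: str, action: str) -> bool:
    """Grant an action only if the role explicitly holds that permission."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("email_user", "send_mail")
assert not is_allowed("email_user", "restart_web_server")  # denied: not needed
```

The privilege-creep problem described above corresponds here to permissions accumulating in a role's set over time; periodic review of these mappings against current job duties is the usual countermeasure.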
Physical controls monitor and control the environment of the work place and computing facilities. They also monitor and control access to and from such facilities and include doors, locks, heating and air conditioning, smoke and fire alarms, fire suppression systems, cameras, barricades, fencing, security guards, cable locks, etc. Separating the network and workplace into functional areas are also physical controls.
An important physical control that is frequently overlooked is separation of duties, which ensures that an individual cannot complete a critical task alone. For example, an employee who submits a request for reimbursement should not also be able to authorize payment or print the check. An applications programmer should not also be the server administrator or the database administrator; these roles and responsibilities must be separated from one another.
Information security must protect information throughout its lifespan, from the initial creation of the information on through to the final disposal of the information. The information must be protected while in motion and while at rest. During its lifetime, information may pass through many different information processing systems and through many different parts of information processing systems. There are many different ways the information and information systems can be threatened. To fully protect the information during its lifetime, each component of the information processing system must have its own protection mechanisms. The building up, layering on, and overlapping of security measures is called “defense in depth.” In contrast to a metal chain, which is famously only as strong as its weakest link, the defense in depth strategy aims at a structure where, should one defensive measure fail, other measures will continue to provide protection.
Recall the earlier discussion about administrative controls, logical controls, and physical controls. The three types of controls can be used to form the basis upon which to build a defense in depth strategy. With this approach, defense in depth can be conceptualized as three distinct layers or planes laid one on top of the other. Additional insight into defense in depth can be gained by thinking of it as forming the layers of an onion, with data at the core of the onion, people the next outer layer of the onion, and network security, host-based security, and application security forming the outermost layers of the onion. Both perspectives are equally valid, and each provides valuable insight into the implementation of a good defense in depth strategy.
An important aspect of information security and risk management is recognizing the value of information and defining appropriate procedures and protection requirements for the information. Not all information is equal and so not all information requires the same degree of protection. This requires information to be assigned a security classification. The first step in information classification is to identify a member of senior management as the owner of the particular information to be classified. Next, develop a classification policy. The policy should describe the different classification labels, define the criteria for information to be assigned a particular label, and list the required security controls for each classification.
Some factors that influence which classification information should be assigned include how much value that information has to the organization, how old the information is and whether or not the information has become obsolete. Laws and other regulatory requirements are also important considerations when classifying information. The Information Systems Audit and Control Association (ISACA) and its Business Model for Information Security also serves as a tool for security professionals to examine security from a systems perspective, creating an environment where security can be managed holistically, allowing actual risks to be addressed.
The type of information security classification labels selected and used will depend on the nature of the organization, with examples being:
In the business sector, labels such as: Public, Sensitive, Private, Confidential.
In the government sector, labels such as: Unclassified, Unofficial, Protected, Confidential, Secret, Top Secret, and their non-English equivalents.
All employees in the organization, as well as business partners, must be trained on the classification schema and understand the required security controls and handling procedures for each classification. The classification assigned to a particular information asset should be reviewed periodically to ensure the classification is still appropriate for the information and to ensure the security controls required by the classification are in place and are followed correctly.
Access to protected information must be restricted to people who are authorized to access the information. The computer programs, and in many cases the computers that process the information, must also be authorized. This requires that mechanisms be in place to control the access to protected information. The sophistication of the access control mechanisms should be in parity with the value of the information being protected; the more sensitive or valuable the information the stronger the control mechanisms need to be. The foundation on which access control mechanisms are built start with identification and authentication.
Identification is an assertion of who someone is or what something is. If a person makes the statement “Hello, my name is John Doe” they are making a claim of who they are. However, their claim may or may not be true. Before John Doe can be granted access to protected information it will be necessary to verify that the person claiming to be John Doe really is John Doe. Typically the claim is in the form of a username. By entering that username you are claiming “I am the person the username belongs to”.
Authentication is the act of verifying a claim of identity. When John Doe goes into a bank to make a withdrawal, he tells the bank teller he is John Doe, a claim of identity. The bank teller asks to see a photo ID, so he hands the teller his driver’s license. The bank teller checks the license to make sure it has John Doe printed on it and compares the photograph on the license against the person claiming to be John Doe. If the photo and name match the person, then the teller has authenticated that John Doe is who he claimed to be. Similarly, by entering the correct password, the user is providing evidence that he/she is the person the username belongs to.
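A minimal sketch of this username-and-password flow: the system stores only a salted, slow hash of the password, never the password itself, and verifies a login attempt by recomputing the hash. The iteration count and salt size below are illustrative, not recommendations.

```python
import hashlib, hmac, os

def hash_password(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Derive a slow, salted hash; the plaintext password is never stored."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

# Enrollment: store (salt, hash) against the claimed identity.
salt = os.urandom(16)
stored = hash_password("correct horse battery staple", salt)

def verify(password: str) -> bool:
    """Authenticate by recomputing the hash; constant-time comparison."""
    return hmac.compare_digest(hash_password(password, salt), stored)

assert verify("correct horse battery staple")   # correct evidence of identity
assert not verify("wrong password")             # claim rejected
```

Storing only the derived hash means that even a theft of the credential database does not directly disclose passwords, though weak passwords remain guessable offline.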
There are three different types of information that can be used for authentication:
something you know: things such as a PIN, a password, or your mother’s maiden name
something you have: a driver’s license or a magnetic swipe card
something you are: biometrics, including palm prints, fingerprints, voice prints, and retina (eye) scans
Strong authentication requires providing more than one type of authentication information (two-factor authentication). The username is the most common form of identification on computer systems today and the password is the most common form of authentication. Usernames and passwords have served their purpose, but they are increasingly inadequate. Usernames and passwords are slowly being replaced or supplemented with more sophisticated authentication mechanisms such as Time-based One-time Password algorithms.
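The time-based one-time password mechanism mentioned above can be sketched directly from RFC 6238 using only the standard library. Real deployments should use a maintained library, but the core algorithm is short: HMAC the current 30-second time step with a shared secret and truncate the result to a few digits.

```python
import base64, hashlib, hmac, struct

def totp(secret_b32: str, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", unix_time // step)   # 8-byte big-endian time step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The shared secret is usually exchanged as a Base32 string (e.g. in a QR code).
secret = base64.b32encode(b"12345678901234567890").decode()
code = totp(secret, 59, digits=8)   # matches the RFC 6238 Appendix B test vector
```

Because the code changes every time step and is derived from something the user has (the device holding the secret), combining it with a password yields the two-factor authentication described above.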
After a person, program or computer has successfully been identified and authenticated then it must be determined what informational resources they are permitted to access and what actions they will be allowed to perform (run, view, create, delete, or change). This is called authorization. Authorization to access information and other computing services begins with administrative policies and procedures. The policies prescribe what information and computing services can be accessed, by whom, and under what conditions. The access control mechanisms are then configured to enforce these policies. Different computing systems are equipped with different kinds of access control mechanisms. Some may even offer a choice of different access control mechanisms. The access control mechanism a system offers will be based upon one of three approaches to access control, or it may be derived from a combination of the three approaches.
The non-discretionary approach consolidates all access control under a centralized administration. Access to information and other resources is usually based on the individual’s function (role) in the organization or the tasks the individual must perform. The discretionary approach gives the creator or owner of the information resource the ability to control access to those resources. In the mandatory access control approach, access is granted or denied based upon the security classification assigned to the information resource.
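The mandatory approach can be sketched as a comparison between a subject's clearance and a resource's classification, using labels like the government examples discussed earlier; the specific level ordering below is illustrative.

```python
# Illustrative ordering of classification levels, lowest to highest.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def may_read(subject_clearance: str, object_classification: str) -> bool:
    """'No read up': a subject may read only objects at or below its level."""
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

assert may_read("Secret", "Confidential")        # reading down is permitted
assert not may_read("Confidential", "Top Secret")  # reading up is denied
```

Unlike the discretionary approach, neither the subject nor the resource owner can override this comparison; the classification labels themselves, set centrally, determine every access decision.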
To be effective, policies and other security controls must be enforceable and upheld. Effective policies ensure that people are held accountable for their actions. The U.S. Treasury‘s guidelines for systems processing sensitive or proprietary information, for example, states that all failed and successful authentication and access attempts must be logged, and all access to information must leave some type of audit trail.
Also, the need-to-know principle needs to be in effect when talking about access control. This principle grants a person only the access rights needed to perform their job functions, and it is used in government when dealing with different clearances. Even though two employees in different departments have a top-secret clearance, they must have a need to know in order for information to be exchanged. Within the need-to-know principle, network administrators grant the employee the least amount of privilege, to prevent employees from accessing more than they are supposed to. Need-to-know helps to enforce the confidentiality-integrity-availability triad, directly supporting the confidentiality area of the triad.
Information security uses cryptography to transform usable information into a form that renders it unusable by anyone other than an authorized user; this process is called encryption. Information that has been encrypted (rendered unusable) can be transformed back into its original usable form by an authorized user who possesses the cryptographic key, through the process of decryption. Cryptography is used in information security to protect information from unauthorized or accidental disclosure while the information is in transit (either electronically or physically) and while information is in storage.
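The encrypt/decrypt round trip described above can be illustrated with a toy keystream cipher built from the standard library. This construction is for illustration only; real systems must use vetted, industry-accepted ciphers such as AES rather than anything like this sketch.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive an illustrative keystream by hashing the key with a counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice restores the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"shared secret key"
ciphertext = xor_cipher(key, b"meet at noon")   # unusable without the key
plaintext = xor_cipher(key, ciphertext)         # decryption is the same XOR
assert plaintext == b"meet at noon"
```

The sketch shows the essential asymmetry of the text above: the ciphertext is meaningless to anyone without the key, while an authorized holder of the key recovers the original with the same operation.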
Cryptography provides information security with other useful applications as well, including improved authentication methods, message digests, digital signatures, non-repudiation, and encrypted network communications. Older, less secure applications such as Telnet and File Transfer Protocol (FTP) are slowly being replaced with more secure applications such as Secure Shell (SSH) that use encrypted network communications. Wireless communications can be encrypted using protocols such as WPA/WPA2 or the older (and less secure) WEP. Wired communications (such as ITU-T G.hn) are secured using AES for encryption and X.1035 for authentication and key exchange. Software applications such as GnuPG or PGP can be used to encrypt data files and email.
Cryptography can introduce security problems when it is not implemented correctly. Cryptographic solutions need to be implemented using industry-accepted solutions that have undergone rigorous peer review by independent experts in cryptography. The length and strength of the encryption key is also an important consideration. A key that is weak or too short will produce weak encryption. The keys used for encryption and decryption must be protected with the same degree of rigor as any other confidential information. They must be protected from unauthorized disclosure and destruction, and they must be available when needed. Public key infrastructure (PKI) solutions address many of the problems that surround key management.
The terms “reasonable and prudent person”, “due care“, and “due diligence” have been used in the fields of finance, securities, and law for many years. In recent years these terms have found their way into the fields of computing and information security. U.S. Federal Sentencing Guidelines now make it possible to hold corporate officers liable for failing to exercise due care and due diligence in the management of their information systems.
In the business world, stockholders, customers, business partners, and governments have the expectation that corporate officers will run the business in accordance with accepted business practices and in compliance with laws and other regulatory requirements. This is often described as the “reasonable and prudent person” rule. A prudent person takes due care to ensure that everything necessary is done to operate the business by sound business principles and in a legal, ethical manner. A prudent person is also diligent (mindful, attentive, ongoing) in their due care of the business.
In the field of information security, Harris offers the following definitions of due care and due diligence:
“Due care are steps that are taken to show that a company has taken responsibility for the activities that take place within the corporation and has taken the necessary steps to help protect the company, its resources, and employees.” And, [Due diligence are the] “continual activities that make sure the protection mechanisms are continually maintained and operational.”
Attention should be made to two important points in these definitions. First, in due care, steps are taken to show; this means that the steps can be verified, measured, or even produce tangible artifacts. Second, in due diligence, there are continual activities; this means that people are actually doing things to monitor and maintain the protection mechanisms, and these activities are ongoing.
Organizations have a responsibility to practice duty of care when applying information security. The Duty of Care Risk Analysis Standard (DoCRA) provides principles and practices for evaluating risk. It considers all parties that could be affected by those risks. DoCRA helps evaluate whether safeguards are appropriate for protecting others from harm while presenting a reasonable burden. With increased data breach litigation, companies must balance security controls, compliance, and their mission.
Roles, responsibilities, and segregation of duties defined
Addressed and enforced in policy
Adequate resources committed
Staff aware and trained
A development life cycle requirement
Planned, managed, measurable, and measured
Reviewed and audited
Incident response plans
An incident response plan (IRP) is a group of policies that dictate an organization's reaction to a cyber attack. Once a security breach has been identified, the plan is initiated. It is important to note that a data breach can have legal implications, so knowing local and federal laws is critical. Every plan is unique to the needs of the organization, and it can involve skill sets that are not part of an IT team. For example, a lawyer may be included in the response plan to help navigate the legal implications of a data breach.
As mentioned above, every plan is unique, but most plans will include the following:
Good preparation includes the development of an incident response team (IRT). Skills needed by this team include penetration testing, computer forensics, and network security. This team should also keep track of trends in cybersecurity and modern attack strategies. A training program for end users is important as well, as most modern attack strategies target users on the network.
This part of the incident response plan identifies whether a security event has occurred. When an end user reports information or an admin notices irregularities, an investigation is launched. An incident log is a crucial part of this step. All of the members of the team should be updating this log to ensure that information flows as fast as possible. Once it has been identified that a security breach has occurred, the next step should be activated.
In this phase, the IRT works to isolate the areas where the breach took place in order to limit the scope of the security event. During this phase it is important to preserve information forensically so it can be analyzed later in the process. Containment could be as simple as physically securing a server room or as complex as segmenting a network to prevent the spread of a virus.
This is where the threat that was identified is removed from the affected systems. This could include deleting malicious files, terminating compromised accounts, or removing other affected components. Some events do not require this step; however, it is important to fully understand the event before moving on. This will help to ensure that the threat is completely removed.
This stage is where the systems are restored to normal operation. This stage could include the recovery of data, changing user access information, or updating firewall rules or policies to prevent a breach in the future. Without executing this step, the system could still be vulnerable to future security threats.
In this step, information that has been gathered during this process is used to make future decisions on security. This step is crucial to ensuring that future events are prevented. Using this information to further train admins is critical to the process. This step can also be used to process information that is distributed from other entities who have experienced a security event.
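The phases described above (preparation, identification, containment, eradication, recovery, lessons learned) can be sketched as an ordered workflow with the shared incident log that every team member updates. The class and phase names below are hypothetical, for illustration only:

```python
# Hypothetical sketch of the incident response phases as an ordered workflow.
PHASES = ["preparation", "identification", "containment",
          "eradication", "recovery", "lessons_learned"]

class IncidentResponse:
    def __init__(self):
        self.phase_index = 0
        self.log = []  # the incident log that all team members update

    @property
    def phase(self) -> str:
        return PHASES[self.phase_index]

    def record(self, entry: str):
        """Log an observation, tagged with the current phase."""
        self.log.append((self.phase, entry))

    def advance(self):
        """Activate the next phase once the current one is complete."""
        if self.phase_index < len(PHASES) - 1:
            self.phase_index += 1

irp = IncidentResponse()
irp.record("end user reported a suspicious email")
irp.advance()  # identification: investigation launched
irp.record("breach confirmed on mail server")
irp.advance()  # containment: isolate affected systems
assert irp.phase == "containment"
```

The fixed ordering mirrors the plan itself: containment must not begin before a breach is confirmed, and lessons learned come only after recovery.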
Change management is a formal process for directing and controlling alterations to the information processing environment. This includes alterations to desktop computers, the network, servers, and software. The objectives of change management are to reduce the risks posed by changes to the information processing environment and improve the stability and reliability of the processing environment as changes are made. It is not the objective of change management to prevent or hinder necessary changes from being implemented.
Any change to the information processing environment introduces an element of risk. Even apparently simple changes can have unexpected effects. One of management’s many responsibilities is the management of risk. Change management is a tool for managing the risks introduced by changes to the information processing environment. Part of the change management process ensures that changes are not implemented at inopportune times when they may disrupt critical business processes or interfere with other changes being implemented.
Not every change needs to be managed. Some kinds of changes are a part of the everyday routine of information processing and adhere to a predefined procedure, which reduces the overall level of risk to the processing environment. Creating a new user account or deploying a new desktop computer are examples of changes that do not generally require change management. However, relocating user file shares or upgrading the email server pose a much higher level of risk to the processing environment and are not normal everyday activities. The critical first steps in change management are (a) defining change (and communicating that definition) and (b) defining the scope of the change system.
Change management is usually overseen by a change review board composed of representatives from key business areas, security, networking, systems administrators, database administration, application developers, desktop support, and the help desk. The tasks of the change review board can be facilitated with the use of an automated workflow application. The responsibility of the change review board is to ensure the organization's documented change management procedures are followed. The change management process is as follows:
Request: Anyone can request a change. The person making the change request may or may not be the same person that performs the analysis or implements the change. When a request for change is received, it may undergo a preliminary review to determine if the requested change is compatible with the organization's business model and practices, and to determine the amount of resources needed to implement the change.
Approve: Management runs the business and controls the allocation of resources; therefore, management must approve requests for changes and assign a priority for every change. Management might choose to reject a change request if the change is not compatible with the business model, industry standards, or best practices. Management might also choose to reject a change request if the change requires more resources than can be allocated for the change.
Plan: Planning a change involves discovering the scope and impact of the proposed change; analyzing the complexity of the change; allocating resources; and developing, testing, and documenting both implementation and back-out plans. The criteria on which a decision to back out will be made must also be defined.
Test: Every change must be tested in a safe test environment, which closely reflects the actual production environment, before the change is applied to the production environment. The backout plan must also be tested.
Schedule: Part of the change review board’s responsibility is to assist in the scheduling of changes by reviewing the proposed implementation date for potential conflicts with other scheduled changes or critical business activities.
Communicate: Once a change has been scheduled it must be communicated. The communication is to give others the opportunity to remind the change review board about other changes or critical business activities that might have been overlooked when scheduling the change. The communication also serves to make the help desk and users aware that a change is about to occur. Another responsibility of the change review board is to ensure that scheduled changes have been properly communicated to those who will be affected by the change or otherwise have an interest in the change.
Implement: At the appointed date and time, the changes must be implemented. Part of the planning process was to develop an implementation plan, a testing plan, and a back-out plan. If the implementation of the change fails, the post-implementation testing fails, or other “drop dead” criteria have been met, the back-out plan should be implemented.
Document: All changes must be documented. The documentation includes the initial request for change, its approval, the priority assigned to it, the implementation, testing and back out plans, the results of the change review board critique, the date/time the change was implemented, who implemented it, and whether the change was implemented successfully, failed or postponed.
Post-change review: The change review board should hold a post-implementation review of changes. It is particularly important to review failed and backed out changes. The review board should try to understand the problems that were encountered, and look for areas for improvement.
Change management procedures that are simple to follow and easy to use can greatly reduce the overall risks created when changes are made to the information processing environment. Good change management procedures improve the overall quality and success of changes as they are implemented. This is accomplished through planning, peer review, documentation, and communication.
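The process above can be sketched as an ordered workflow in which no step may be skipped and a back-out path exists for failed implementations. The `ChangeRequest` class and step names below are hypothetical illustrations, not a standard API:

```python
# Hypothetical sketch of the change management steps described above.
STEPS = ["request", "approve", "plan", "test", "schedule",
         "communicate", "implement", "document", "post_change_review"]

class ChangeRequest:
    def __init__(self, description: str):
        self.description = description
        self.completed = []
        self.backed_out = False

    def complete(self, step: str):
        """Mark a step done; steps must occur in the defined order."""
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

    def back_out(self):
        """Invoked when implementation or post-implementation testing fails."""
        self.backed_out = True

cr = ChangeRequest("upgrade the email server")
for step in ["request", "approve", "plan", "test"]:
    cr.complete(step)
assert cr.completed[-1] == "test"
```

Encoding the order explicitly reflects the board's role: a change cannot be scheduled before it is tested, and it cannot be implemented before it is communicated.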
ISO/IEC 20000, The Visible OPS Handbook: Implementing ITIL in 4 Practical and Auditable Steps, and ITIL all provide valuable guidance on implementing an efficient and effective change management program for information security.
Business continuity management (BCM) concerns arrangements aiming to protect an organization's critical business functions from interruption due to incidents, or at least minimize the effects. BCM is essential to any organization to keep technology and business in line with current threats to the continuation of business as usual. BCM should be included in an organization's risk analysis plan to ensure that all of the necessary business functions have what they need to keep going in the event of any type of threat to any business function.
Analysis of requirements, e.g., identifying critical business functions, dependencies and potential failure points, potential threats and hence incidents or risks of concern to the organization;
Specification, e.g., maximum tolerable outage periods; recovery point objectives (maximum acceptable periods of data loss);
Architecture and design, e.g., an appropriate combination of approaches including resilience (e.g. engineering IT systems and processes for high availability, avoiding or preventing situations that might interrupt the business), incident and emergency management (e.g., evacuating premises, calling the emergency services, triage/situation assessment and invoking recovery plans), recovery (e.g., rebuilding) and contingency management (generic capabilities to deal positively with whatever occurs using whatever resources are available);
Implementation, e.g., configuring and scheduling backups, data transfers, etc., duplicating and strengthening critical elements; contracting with service and equipment suppliers;
Testing, e.g., business continuity exercises of various types, costs and assurance levels;
Management, e.g., defining strategies, setting objectives and goals; planning and directing the work; allocating funds, people and other resources; prioritization relative to other activities; team building, leadership, control, motivation and coordination with other business functions and activities (e.g., IT, facilities, human resources, risk management, information risk and security, operations); monitoring the situation, checking and updating the arrangements when things change; maturing the approach through continuous improvement, learning and appropriate investment;
Assurance, e.g., testing against specified requirements; measuring, analyzing, and reporting key parameters; conducting additional tests, reviews and audits for greater confidence that the arrangements will go to plan if invoked.
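The recovery point objective mentioned in the specification step lends itself to a small worked example: if backups run every six hours, up to six hours of changes can be lost in the worst case, so the backup interval must not exceed the RPO. The helper below is a hypothetical sketch of that check:

```python
def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """Worst-case data loss equals the backup interval; it must not exceed the RPO."""
    return backup_interval_hours <= rpo_hours

# 6-hour backups satisfy an 8-hour RPO; daily backups do not.
assert meets_rpo(6, 8)
assert not meets_rpo(24, 8)
```

The same reasoning applies to the maximum tolerable outage period: the time to detect a failure plus the time to execute the recovery plan must stay within it.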
Whereas BCM takes a broad approach to minimizing disaster-related risks by reducing both the probability and the severity of incidents, a disaster recovery plan (DRP) focuses specifically on resuming business operations as quickly as possible after a disaster. A disaster recovery plan, invoked soon after a disaster occurs, lays out the steps necessary to recover critical information and communications technology (ICT) infrastructure. Disaster recovery planning includes establishing a planning group, performing risk assessment, establishing priorities, developing recovery strategies, preparing inventories and documentation of the plan, developing verification criteria and procedure, and lastly implementing the plan.
Laws and regulations
Privacy International 2007 privacy ranking (green: protections and safeguards; red: endemic surveillance societies)
Below is a partial listing of governmental laws and regulations in various parts of the world that have, had, or will have, a significant effect on data processing and information security. Important industry sector regulations have also been included when they have a significant impact on information security.
The UK Data Protection Act 1998 makes new provisions for the regulation of the processing of information relating to individuals, including the obtaining, holding, use or disclosure of such information. The European Union Data Protection Directive (EUDPD) requires that all E.U. members adopt national regulations to standardize the protection of data privacy for citizens throughout the E.U.
The E.U.’s Data Retention Directive (annulled) required internet service providers and phone companies to keep data on every electronic message sent and phone call made for between six months and two years.
The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99) is a U.S. federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the U.S. Department of Education. Generally, schools must have written permission from the parent or eligible student in order to release any information from a student's education record.
The Federal Financial Institutions Examination Council’s (FFIEC) security guidelines for auditors specifies requirements for online banking security.
The Health Insurance Portability and Accountability Act (HIPAA) of 1996 requires the adoption of national standards for electronic health care transactions and national identifiers for providers, health insurance plans, and employers. Additionally, it requires health care providers, insurance providers and employers to safeguard the security and privacy of health data.
The Gramm–Leach–Bliley Act of 1999 (GLBA), also known as the Financial Services Modernization Act of 1999, protects the privacy and security of private financial information that financial institutions collect, hold, and process.
Section 404 of the Sarbanes–Oxley Act of 2002 (SOX) requires publicly traded companies to assess the effectiveness of their internal controls for financial reporting in annual reports they submit at the end of each fiscal year. Chief information officers are responsible for the security, accuracy, and the reliability of the systems that manage and report the financial data. The act also requires publicly traded companies to engage with independent auditors who must attest to, and report on, the validity of their assessments.
State security breach notification laws (California and many others) require businesses, nonprofits, and state institutions to notify consumers when unencrypted “personal information” may have been compromised, lost, or stolen.
The Personal Information Protection and Electronic Documents Act (PIPEDA) of Canada supports and promotes electronic commerce by protecting personal information that is collected, used or disclosed in certain circumstances, by providing for the use of electronic means to communicate or record information or transactions and by amending the Canada Evidence Act, the Statutory Instruments Act and the Statute Revision Act.
Greece’s Hellenic Authority for Communication Security and Privacy (ADAE) (Law 165/2011) establishes and describes the minimum information security controls that should be deployed by every company which provides electronic communication networks and/or services in Greece in order to protect customers’ confidentiality. These include both managerial and technical controls (e.g., log records should be stored for two years).
Greece’s Hellenic Authority for Communication Security and Privacy (ADAE) (Law 205/2013) concentrates around the protection of the integrity and availability of the services and data offered by Greek telecommunication companies. The law forces these and other related companies to build, deploy, and test appropriate business continuity plans and redundant infrastructures.
Information security culture
Describing more than simply how security-aware employees are, information security culture is the ideas, customs, and social behaviors of an organization that impact information security in both positive and negative ways. Cultural concepts can help different segments of the organization work effectively toward information security, or can undermine that effectiveness. The way employees think and feel about security, and the actions they take, can have a big impact on information security in organizations. Roer & Petric (2017) identify seven core dimensions of information security culture in organizations:
Attitudes: Employees’ feelings and emotions about the various activities that pertain to the organizational security of information.
Behaviors: Actual or intended activities and risk-taking actions of employees that have direct or indirect impact on information security.
Cognition: Employees’ awareness, verifiable knowledge, beliefs, and self-efficacy related to information security practices and activities.
Communication: Ways employees communicate with each other, sense of belonging, support for security issues, and incident reporting.
Compliance: Adherence to organizational security policies, awareness of the existence of such policies and the ability to recall the substance of such policies.
Norms: Perceptions of security-related organizational conduct and practices that are informally deemed either normal or deviant by employees and their peers, e.g. hidden expectations regarding security behaviors and unwritten rules regarding uses of information-communication technologies.
Responsibilities: Employees’ understanding of the roles and responsibilities they have as a critical factor in sustaining or endangering the security of information, and thereby the organization.
Andersson and Reimers (2014) found that employees often do not see themselves as part of the organization's information security “effort” and often take actions that ignore organizational information security best interests. Research shows information security culture needs to be improved continuously. In Information Security Culture from Analysis to Change, the authors commented, “It’s a never ending process, a cycle of evaluation and change or maintenance.” To manage the information security culture, five steps should be taken: pre-evaluation, strategic planning, operative planning, implementation, and post-evaluation.
Pre-Evaluation: to identify the awareness of information security within employees and to analyze current security policy
Strategic Planning: to come up with a better awareness program, clear targets need to be set; clustering people into groups is helpful to achieve it
Operative Planning: create a good security culture based on internal communication, management buy-in, security awareness, and training programs
Implementation: should feature commitment of management, communication with organizational members, courses for all organizational members, and commitment of the employees
Post-evaluation: to better gauge the effectiveness of the prior steps and build on continuous improvement
The International Organization for Standardization (ISO) is a consortium of national standards institutes from 157 countries, coordinated through a secretariat in Geneva, Switzerland. ISO is the world’s largest developer of standards. ISO 15443: “Information technology – Security techniques – A framework for IT security assurance”, ISO/IEC 27002: “Information technology – Security techniques – Code of practice for information security management”, ISO-20000: “Information technology – Service management”, and ISO/IEC 27001: “Information technology – Security techniques – Information security management systems – Requirements” are of particular interest to information security professionals.
The Internet Society is a professional membership society with more than 100 organizations and over 20,000 individual members in over 180 countries. It provides leadership in addressing issues that confront the future of the internet, and it is the organizational home for the groups responsible for internet infrastructure standards, including the Internet Engineering Task Force (IETF) and the Internet Architecture Board (IAB). The ISOC hosts the Requests for Comments (RFCs) which includes the Official Internet Protocol Standards and the RFC-2196 Site Security Handbook.
The Information Security Forum (ISF) is a global nonprofit organization of several hundred leading organizations in financial services, manufacturing, telecommunications, consumer goods, government, and other areas. It undertakes research into information security practices and offers advice in its biannual Standard of Good Practice and more detailed advisories for members.
The Institute of Information Security Professionals (IISP) is an independent, non-profit body governed by its members, with the principal objective of advancing the professionalism of information security practitioners and thereby the professionalism of the industry as a whole. The institute developed the IISP Skills Framework. This framework describes the range of competencies expected of information security and information assurance professionals in the effective performance of their roles. It was developed through collaboration between both private and public sector organizations, world-renowned academics, and security leaders.
The German Federal Office for Information Security (in German, Bundesamt für Sicherheit in der Informationstechnik (BSI)) BSI-Standards 100-1 to 100-4 are a set of recommendations including “methods, processes, procedures, approaches and measures relating to information security”. The BSI-Standard 100-2 IT-Grundschutz Methodology describes how information security management can be implemented and operated. The standard includes a very specific guide, the IT Baseline Protection Catalogs (also known as IT-Grundschutz Catalogs). Before 2005, the catalogs were known as the “IT Baseline Protection Manual”. The catalogs are a collection of documents useful for detecting and combating security-relevant weak points in the IT environment (IT cluster). As of September 2013, the collection encompasses over 4,400 pages, including the introduction and catalogs. The IT-Grundschutz approach is aligned with the ISO/IEC 2700x family.