Apache Iceberg promises to change the economics of cloud-based data analytics – The Register

Feature By 2015, Netflix had completed its move from an on-premises data warehouse and analytics stack to one based around AWS S3 object storage. But the environment soon began to hit some snags.

"Let me tell you a little bit about Hive tables and our love/hate relationship with them," said Ted Gooch, former database architect at the streaming service.

While there were some good things about Hive, there were also some performance-based issues and "some very surprising behaviors."

"Because it's not a heterogeneous format or a format that's well defined, different engines supported things in different ways," Gooch now a software engineer at Stripe and an Iceberg committer said in an online video posted by data lake company Dremio.

Out of these performance and usability challenges inherent in Apache Hive tables in large and demanding data lake environments, the Netflix data team developed a specification for Iceberg, a table format for slow-moving data or slow-evolving data, as Gooch put it. The project was developed at Netflix by Ryan Blue and Dan Weeks, now co-founders of Iceberg company Tabular, and was donated to the Apache Software Foundation as an open source project in November 2018.

Apache Iceberg is an open table format designed for large-scale analytical workloads, with support for query engines including Spark, Trino, Flink, Presto, Hive and Impala. The approach promises to help organizations bring their analytics engine of choice to their data without going through the expense and inconvenience of moving it to a new data store. It has also won support from data warehouse and data lake big hitters including Google, Snowflake and Cloudera.
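To make that engine-agnostic idea concrete, here is a minimal PySpark sketch of creating an Iceberg table in object storage. The catalog name, bucket path, schema and runtime versions are illustrative assumptions rather than details from the article; any other engine with an Iceberg connector could then read and write the same table.

```python
# A hedged sketch: register an Iceberg catalog in Spark and create a table.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-demo")
    # Pull in the Iceberg runtime and SQL extensions; versions are assumptions.
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.3.1")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # An Iceberg catalog backed by a filesystem/object-store warehouse.
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3a://my-bucket/warehouse")
    .getOrCreate()
)

# The table lives in object storage; Trino, Flink, Hive and others with an
# Iceberg connector can operate on the very same table.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.db.events (
        event_time TIMESTAMP,
        user_id    BIGINT,
        action     STRING
    ) USING iceberg
    PARTITIONED BY (days(event_time))
""")

spark.sql("SELECT count(*) FROM demo.db.events").show()
```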

Cloud-based blob storage like AWS S3 does not have a way of showing the relationships between files or between a file and a table. As well as making life tough for query engines, it makes changing schemas and time travel difficult. Iceberg sits in the middle of what is a big and growing market. Data lakes alone were estimated to be worth $11.7 billion in 2021, forecast to grow to $61.07 billion by 2029.

"If you're looking at Iceberg from a data lake background, its features are impressive: queries can time travel, transactions are safe so queries never lie, partitioning (data layout) is automatic and can be updated, schema evolution is reliable no more zombie data! and a lot more," Blue explained in a blog.

But it also has implications for data warehouses, he said. "Iceberg was built on the assumption that there is no single query layer. Instead, many different processes all use the same underlying data and coordinate through the table format along with a very lightweight catalog. Iceberg enables direct data access needed by all of these use cases and, uniquely, does it without compromising the SQL behavior of data warehouses."

In October, BigLake, Google Cloud's data lake storage engine, began supporting Apache Iceberg, with support for Databricks' Delta format and the streaming-oriented Hudi set to come soon.

Speaking to The Register, Sudhir Hasbe, senior director of product management at Google Cloud, said: "If you're doing fine-grained access control, you need to have a real table format, [analytics engine] Spark is not enough for that. We had some discussion around whether we are going with Iceberg, Delta or Hudi, and our prioritization was based on customer feedback. Some of our largest customers were basically deciding in the same realm and they wanted to have something that was really open, driven by the community and so on. Snap [social media company] is one of our early customers, all their analytics is [on Google Cloud] and they wanted to push us towards Iceberg over other formats."

He said Iceberg was becoming the "primary format," although Google is committed to supporting Hudi and Delta in the future. He noted Cloudera and Snowflake were now supporting Iceberg while Google has a partnership with Salesforce over the Iceberg table format.

Cloudera started in 2008 as a data lake company based on Hadoop, which in its early days was run on distributed commodity systems on-premises, with a gradual shift to cloud hosting coming later.

Today, Cloudera sees itself as a multi-cloud data lake platform, and in July it announced its adoption of the Iceberg open table format.

Chris Royles, Cloudera's Field CTO, told The Register that since it was first developed, Iceberg had seen steady adoption as contributions grew from a number of different organizations, but that vendor interest had begun to ramp up over the last year.

"It has lots of capability, but it's very simple," he said. "It's a client library: you can integrate it with any number of client applications, and they can become capable of managing Iceberg table format. It enables us to think in terms of how different clients both within the Cloudera ecosystem, and outside it the likes of Google or Snowflake could interact with the same data. Your data is open. It's in a standard format. You can determine how to manage, secure and own it. You can also bring whichever tools you choose to bear on that data."

The result is a reduction in the cost of moving data, and improved throughput and performance, Royles said. "The sheer volume of data you can manage, the number of data objects you can manage and the complexity of the partitioning: it's a multiplication factor. You're talking five or 10 times more capable by using Iceberg as a table format."

Snowflake kicked off as a data warehouse, wowing investors with its so-called cloud-native approach to separating storage and compute, allowing a more elastic method than on-prem-based data warehousing. Since its 2020 IPO, which briefly saw it hit a value of $120 billion, the company has diversified as a cloud-based data platform, supporting unstructured data, the machine learning language Python, transactional data and, most recently, Apache Iceberg.

James Malone, Snowflake senior product manager, told El Reg that cloud blob storage such as that offered by AWS, Google and Azure is durable and inexpensive, but could present challenges when it comes to high-performance analytics.

"The canonical example is if you have 1,000 Apache Parquet files, if you have an engine that's operating on those files, you have to go tell it if they these 1000 tables with one parquet file a piece or if it is two tables with 500 parquet files it doesn't know," he said. "The problem is even more complex when you have multiple engines operating on the same set of data and then you want things like ACID-compliance and like safe data types. It becomes a huge, complicated mess. As cheap durable cloud storage has proliferated it has also put pressure downward pressure on the problem of figuring out how to do high-performance analytics on top of that. People like the durability and the cost-effectiveness of storage, but they also there's a set of expectations and a set of desires in terms of how engines can work and how you can derive value from that data."

Snowflake supports the idea that Iceberg is agnostic both in terms of the file format and analytics engine. For a cloud-based data platform with a steadily expanding user base, this represents a significant shift in how customers will interact with and, crucially, pay for Snowflake.

The first and smallest move is the idea of external tables. When files are registered with an external table, metadata about the files is saved, and a schema is applied on read when a query is run on the table. "That allows you to project a table on top of a set of data that's managed by some other system, so maybe I do have a Hadoop cluster with a metastore: that system owns the security, it owns the updates, it owns the transactional safety," Malone said. "External tables are really good for situations like that, because it allows you to not only query the data in Snowflake, but you can also use our data sharing and governance tools."
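As a rough illustration of the scenario Malone describes, the sketch below uses the snowflake-connector-python package to project an external table over files owned by another system. The account details, stage URL and DDL options are assumptions for illustration and may not match the exact syntax of a given Snowflake edition.

```python
import snowflake.connector

# Connection details are placeholders, not real credentials.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="analytics_wh", database="lake", schema="public",
)
cur = conn.cursor()

# Point a stage at files owned and updated by some other system.
cur.execute("CREATE STAGE IF NOT EXISTS lake_stage URL='s3://my-bucket/events/'")

# Project a schema-on-read table over those files without loading them.
cur.execute("""
    CREATE EXTERNAL TABLE IF NOT EXISTS ext_events
    LOCATION = @lake_stage
    FILE_FORMAT = (TYPE = PARQUET)
    AUTO_REFRESH = TRUE
""")

# Query in place; Snowflake's sharing and governance tools still apply.
cur.execute("SELECT COUNT(*) FROM ext_events")
print(cur.fetchone())
```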

But the bigger move from Snowflake, currently only available in preview, is its plan to build a brand-new table type inside Snowflake. It is set to have parity in terms of features and performance with a standard Snowflake table, but uses Parquet as the data format and Iceberg as the metadata format. Crucially, it allows customers to bring their own storage to Snowflake instead of Snowflake managing the storage for them, storage being a potentially significant cost in an analytics setup. "Traditionally with the standard Snowflake table, Snowflake provides the cloud storage. With an Iceberg table, it's the customer that provides the cloud storage and that's a huge shift," Malone said.

The move promises to give customers the option of taking advantage of volume discounts negotiated with blob storage providers across all their storage, or of negotiating new deals based on demand, while paying Snowflake only for the technology it provides in terms of analytics, governance, security and so on.

"The reality is, customers have a lot of data storage and telling people to go move and load data into your system creates friction for them to actually go use your product and is not generally a value add for the customer," Malone said. "So we've built Iceberg tables in a way where our platform benefits work, without customers having to go through the process of loading data into Snowflake. It meets the customer where they are and still provides all of the benefits."

But Iceberg does not only affect the data warehouse market; it also has an impact on data lakes and the emerging lakehouse category, which claims to be a useful combination of the data warehouse and lake concepts. Founded in 2015, Dremio places itself in the lakehouse category, also espoused by Databricks and tiny Californian startup Onehouse.

Dremio was the first tech vendor to really start evangelizing Iceberg, according to co-founder and chief product officer Tomer Shiran. Unlike Snowflake and other data warehouse vendors, Dremio has always advocated an open data architecture, using Iceberg to bring analytics to the data, rather than the other way around, he said. "The world is moving in our direction. All the big tech companies have been built on an open data architecture and now the leading banks are moving with them."

Shiran said the difference with Dremio's incorporation of Iceberg is that the company has used the table format to design a platform that supports concurrent production workloads, in the same way as traditional data warehouses, while offering users the flexibility to access data where they have it through a business-level UI. That contrasts, he said, with the approach of Databricks, for example, which is designed more with data scientists in mind.

While Databricks supports both its own Delta table standard and Iceberg, Shiran argues that Iceberg's breadth of support will help it win out in the long run.

"Neither is going away," Shiran said. "Our own query engine supports both table formats, but Iceberg is vendor agnostic and Apache marshals contributions from dozens companies including Netflix, Apple and Amazon. You can see how diverse it is but with Delta, although it is technically open source, Databricks is the sole contributor."

However, Databricks disputes this line. Speaking to The Register in November, CEO and co-founder Ali Ghodsi said there were multiple ways to justify Delta Lake as an open source project. "It's a Linux Foundation [project]. We contribute a lot to it, but its governance structure is in the Linux Foundation. And then there's Iceberg and Hudi, which are both Apache projects."

Ghodsi argued the three table formats (Iceberg, Hudi and Delta) were similar and all were likely to be adopted across the board by the majority of vendors. But the lakehouse concept distinguishes Databricks from the data warehouse vendors even as they make efforts to adopt these formats.

"The data warehousing engines all say they support Iceberg, Hudi and Delta, but they're not optimized really for this," he said. "They're not incentivized to do it well either because if they do that well, then their own revenue will be cannibalized: you don't need to pay any more for storing the data inside the data warehouse. A lot of this is, frankly speaking, marketing by a lot of vendors to check a box. We're excited that the lakehouse actually is taking off. And we believe that the future will be lakehouse-first. Vendors like Databricks, like Starburst, like Dremio will be the way people want to use this."

Nonetheless, database vendor Teradata has eschewed the lakehouse concept. Speaking to The Register in October, CTO Stephen Brobst argued that a data lake and data warehouse should be discrete concepts within a coherent data architecture. The argument plays to the vendor's historic strengths in query optimization and supporting thousands of concurrent users in analytics implementations which include some of the world's largest banks and retailers.

Hyoun Park, CEO and chief analyst at Amalgam Insights, said most vendors are likely to support all three table formats (Iceberg, Delta and Hudi) in some form or other, but Snowflake's move with Iceberg is the most significant because it represents a departure for the data warehouse firm in terms of both its cost model and how it can be deployed.

"It's going to continue to be a three-platform race, at least for the next couple of years, because Hudi benchmarks as being slower than the other two platforms but provides more flexibility in how you can use the data, how you can read the data, how you can ingest the data. Delta Lake versus Iceberg tends to be more of a commercial decision because of the way that the vendors have supported this basically, Databricks on one side and everybody else on the other," he said.

But when it comes to Snowflake, the argument takes on a new dimension. Although Iceberg promises to extend the application of the data warehouse vendor's analytics engine beyond its environment, potentially reducing the cost inherent in moving data, that will come at a price: the very qualities that made Snowflake so appealing in the first place, Park said.

"You're now managing two technologies rather than simply managing your data warehouse which was which is the appeal of Snowflake," he said. "Snowflake is very easy to get started as a data warehouse. And that ease of use is the kind of that first hit, that drug-like experience, that gets Snowflake started within the enterprise. And then because Snowflakes pricing is so linked to data use, companies quickly find that as their data grows 50, 60, 70, or 100 percent per year. Their Snowflake bills increase just as quickly. Using Iceberg tables is going to be a way to cut some of those costs, but it comes at the price of losing the ease of use that Snowflake has provided."

Apache Iceberg surfaced in 2022 as a technology to watch to help solve problems in data integration, management and costs. Deniz Parmaksız, a machine learning engineer with customer experience platform Insider, recently claimed it cut the company's Amazon S3 costs by 90 percent.

While major players including Google, Snowflake, Databricks, Dremio and Cloudera have set out their stall on Iceberg, AWS and Azure have been more cautious. With Amazon Athena, the serverless analytics service, users can query Iceberg data. But on Azure, ingestion from data storage systems that provide ACID functionality on top of regular Parquet format files, such as Iceberg, Hudi and Delta Lake, is not supported. Microsoft has been contacted for clarity on its approach. Nonetheless, in 2023, expect to see more news on the emerging data format, which promises to shake up the burgeoning market for cloud data analytics.

MSP vs VMS: What Are the Differences? – StartupGuys.net

Managed service providers (MSPs) have become the norm for businesses that can't handle the expenses of a full-time IT department. As such, businesses are often comparing MSPs to virtual managed services (VMS).

So, is one better than the other? Before you make your decision, you need to know the difference.

Check out our guide below to learn all about MSP vs VMS and which one is right for your company.

An MSP (Managed Service Provider) is an organization that can manage a company's entire IT infrastructure, from servers and storage to networks and applications. The goal of MSPs is to reduce IT overhead and the complexity of managing IT services.

Many MSPs offer a range of solutions, from basic to advanced, to meet the needs of their clients.

VMS (Virtual Managed Services) is a type of IT service management that provides users with virtualized IT infrastructure capabilities, creating a flexible network infrastructure without the added costs associated with traditional physical IT equipment and equipment maintenance. VMS enables customers to get the most from their IT investments without the cost and complexity of managing and maintaining physical infrastructure.

VMS offerings typically include everything needed for the virtual network, along with access to services that free up staff time for more strategic work, with the added assurance that the business's IT needs will continue to be met. VMS enables businesses to save time and money while ensuring their IT infrastructure meets their organizational needs.

MSP is a model where a business outsources certain professional or technology services to a managed service provider. This allows businesses to concentrate on their core business functions and to leverage a service provider's expertise to manage and optimize their IT operations.

VMS can also refer to virtual machines, which are virtualized systems running on one or more physical machines. VMS is an efficient way for businesses that want to reduce workloads, cutting costs and improving performance.

Understanding these core concepts is essential for businesses, especially those looking to increase efficiency and maximize their profitability. Leveraging the tools and expertise of an MSP and taking advantage of the benefits of a VMS are key steps in improving an organization's IT operations in today's market.

When it comes to comparing MSP and VMS, there are pros and cons for both.

MSP offers a comprehensive customer tracking and reporting system, along with the power of the Microsoft suite of products. This can be great for businesses that have a lot of customer data to manage or tasks that require a lot of collaboration.

It does, however, have an associated cost, and users need to keep up to date with the latest version if they want to take advantage of the latest features. However, you can also find affordable MSP tools that can still fit your business needs.

VMS is a cloud-based project management solution. It provides a much simpler way of running projects, with no upfront cost and the ability to roll out changes quickly. However, the scalability might be limited, and certain features can be missing compared to MSP.

Ultimately, the decision between MSP and VMS depends on the particular business needs and the associated budget.

MSP versus VMS (Virtual Machine Services) is an intriguing comparison when it comes to platform flexibility. MSP is a managed service provider that offers cost savings on IT services, while VMS is a system that provides cloud computing, virtualization, and other platforms.

Both platforms offer a variety of services and capabilities. For example, VMS offers users the ability to deploy applications in the cloud, while MSPs can provide a customized experience with dedicated hosting and a variety of services.

MSPs are more customizable than VMS, allowing clients to tailor their hosting environment and applications to their specific needs. However, VMS typically offers more features than individual MSPs.

VMS provides scalability with the ability to scale up or down easily. This flexibility gives users the ability to adjust their environment as needs change.

Overall, both MSPs and VMS offer platform flexibility to varying degrees. The difference lies in which better suits individual needs. Before making a decision, it is important to weigh the pros and cons of each option to find the right solution.

MSP and VMS models have quickly become the go-to software solutions for businesses, especially those seeking cost-effective applications.

However, the world of MSP vs VMS can be hard to navigate. Oftentimes, businesses struggle to make sense of the differences between the two.

That's why businesses should always look for the best-fit solution to ensure they get the most out of their investment. It's important to weigh the pros and cons of each model and determine which one best aligns with the business's goals.

Additionally, they should take other important factors into account.

By thoroughly assessing the cost-benefit of MSP and VMS, businesses can make informed decisions that provide better cost-effective solutions for their needs.

Comparing MSP vs VMS and the many benefits of each can help you decide the best option for your company. Crucial for success is figuring out a plan that works for your company and your budget.

Take the time to weigh the pros and cons and get the most out of your plan. Don't hesitate to contact an experienced provider for advice and guidance. Make the decision today that helps secure your future success!

Should you wish to read more articles aside from this basic MSP guide and VMS guide, visit our blog.

5 Unstoppable Metaverse Stocks to Buy in 2023 – The Motley Fool

Some investors could have a case of the "metaverse mehs." There was a lot of buzz initially about a virtual universe that could generate trillions of dollars in annual revenue. But after the hype wore off, many once high-flying stocks with metaverse connections lost their luster.

However, the long-term potential for the metaverse remains as great as ever. And many of the companies that are best positioned to succeed in the future have other businesses that are already lucrative. Their stocks give investors a way to profit on the metaverse without betting the farm on it. Here are five unstoppable metaverse stocks to buy in 2023 (listed in alphabetical order).

Amazon (AMZN 2.17%) probably isn't the first stock that comes to mind when you think about the metaverse. However, the company could nonetheless profit tremendously from it. The metaverse will almost certainly operate in the cloud -- and Amazon Web Services (AWS) ranks as the largest cloud-hosting provider in the world.

Metaverse or not, AWS stands out as one of the top reasons to buy Amazon stock in 2023. The cloud-hosting unit continues to grow robustly. One analyst even thinks that AWS is on track to be worth $3 trillion down the road. Amazon as a whole is valued at around $870 billion right now.

Amazon stock is currently down close to 50% below its previous high. This steep decline is largely the result of macroeconomic headwinds and the company's higher spending. The former issue should only be temporary, while Amazon is already taking steps to address the latter. If you're looking for a surefire winner in the next bull market, I think Amazon is a top contender.

Apple (AAPL -3.74%) stands out as another company that isn't as closely identified with the metaverse as some other tech giants. However, that could soon change. Apple reportedly plans to launch a mixed-reality headset, potentially as soon as 2023. Analyst Ming-Chi Kuo, who closely follows the company, predicts that the new device will be "a game-changer for the headset industry." It could also signal a strong opening salvo for Apple's entrance into the metaverse race.

However, Apple doesn't need the metaverse to be successful. The company's iPhone-centered ecosystem continues to generate several hundred billion dollars in sales each year. Augmented reality and 5G adoption should be key drivers of this revenue growth regardless of whether or not the metaverse achieves its potential.

Apple also has other growth drivers. Advertising could become the company's next $10 billion business sooner than expected. Apple also has a major opportunity in the fintech market with Apple Pay. With the stock down close to 30%, buying Apple in the new year could pay off nicely when the inevitable market rebound comes.

No company has tied its fortunes to the metaverse in a more high-profile way than Meta Platforms (META 3.66%). Once known as Facebook, Meta arguably has the grandest vision of what the metaverse could become. It's investing billions of dollars in building hardware and software to make the metaverse dream a reality.

But while the metaverse could be Meta's future, its social media empire pays the bills for now. Many investors are concerned that Facebook and Instagram are losing their appeal. However, Meta still raked in $27.7 billion in revenue in the third quarter of 2022 with $4.4 billion in profits. The number of daily active users on its social platforms increased both year over year and sequentially to 2.93 billion.

NYU professor Aswath Damodaran believes that there's pretty much all upside for Meta based on its valuation. When one of the most influential valuation experts in the world says that, it deserves attention. I'm not sure if Meta stock will necessarily be a big winner in 2023. Buying the stock in the new year, though, could pay off in a huge way over the long run.

Microsoft (MSFT -0.10%) is already partnering with Meta to bring its Teams collaboration software into the metaverse. The company is already a major player in gaming with Xbox. And if its pending acquisition of Activision Blizzard isn't blocked by regulators, Microsoft could become an even bigger force in the metaverse.

Like several others on this list, though, Microsoft's fortunes don't hinge on the metaverse. The company is the 800-pound gorilla in multiple massive markets, including operating systems and office productivity software.

Why buy Microsoft in 2023? For one thing, it's available at a discount with shares down nearly 30% year to date. Microsoft also has several avenues to jump-start its growth, including its Azure cloud-hosting unit and the move into digital advertising.

I'd put Nvidia (NVDA -2.05%) near the top of the list of clear winners if the metaverse takes off as much as some think it will. The company's graphics processing units (GPUs) are the gold standard in running gaming apps. That advantage will likely carry over into the metaverse world.

However, Nvidia's opportunities extend far beyond gaming. In the third quarter of 2022, nearly 65% of the company's total revenue of $5.93 billion came from its data center segment. Professional visualization (which includes Nvidia's Omniverse metaverse platform) and automotive and embedded systems businesses contributed a little over $500 million of the total as well.

Sure, Nvidia has taken a beating this year. The stock is still down nearly 50% year to date even after rising quite a bit in recent months. But Nvidia plans to launch its new Grace CPU Superchip in 2023. Its data center and automotive units should continue to perform well in the new year. If the gaming market begins to recover in 2023, look for Nvidia stock to soar.

Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Keith Speights has positions in Amazon.com, Apple, Meta Platforms, Microsoft, and Nvidia. The Motley Fool has positions in and recommends Activision Blizzard, Amazon.com, Apple, Meta Platforms, Microsoft, and Nvidia. The Motley Fool recommends the following options: long March 2023 $120 calls on Apple and short March 2023 $130 calls on Apple. The Motley Fool has a disclosure policy.

Top 10 Middle East IT stories of 2022 – ComputerWeekly.com

This year has seen the Middle East region host one of the world's biggest sporting events for the first time, when the FIFA World Cup arrived in Qatar in November.

Not only did the oil-rich nation face massive construction challenges, with stadiums and other physical infrastructure needed to host such a large and prestigious event, but it also had to be ready for inevitable cyber attacks.

Cyber security features heavily in this yearly review, with analysis of projects in the United Arab Emirates (UAE) and Saudi Arabia.

Hosting major sporting events might be something countries in the Middle East aspire to do more often as they diversify their economies and reduce their reliance on oil revenues. This top 10 also features articles about some of the new industries being created in the region, the huge sums being invested, as well as some of the challenges being faced.

Here are Computer Weekly's top 10 Middle East IT stories of 2022.

Qatar hosts the FIFA World Cup this year, the first time the event has been staged in the Arab world. Cyber security experts in the country predicted that ticketing, hotel bookings and restaurant reservations would be faked by hackers to capture personal data from people travelling to Qatar.

Also, phishing and social engineering were expected to be used to steal personal and financial information from anyone using the internet to get information about the tournament.

Saudi Arabia's job market is largely shaped by the push for Saudization, a colloquial term for a movement that is officially called nationalisation.

Part of this push is a set of regulations called Nitaqat, which falls under the jurisdiction of the Ministry of Labour and Social Development, and requires organisations operating in Saudi Arabia to maintain certain percentages of Saudi nationals in their workforce.

A group of Google workers and Palestinian rights activists are calling on the tech giant to end its involvement in the secretive Project Nimbus cloud computing contract, which involves the provision of artificial intelligence and machine learning tools to the Israeli government.

Calls for Google to end its involvement in the contract follow claims made by Ariel Koren, a product marketing manager at Google for Education since 2015 and member of the Alphabet Workers Union, that she was pressured into resigning as retaliation for her vocal opposition to the deal.

A survey has revealed that UAE residents believe 3D printing technology will become widespread in the country, and expect it to have the most positive impact on society.

The online survey of more than 1,000 UAE citizens, carried out by YouGov, asked them for their opinions on 16 emerging technologies. According to YouGov: "Data shows that of all the 16 listed technologies, UAE residents have most likely heard a lot about or have some awareness of cryptocurrency, virtual reality, self-driving cars and 3D printing."

The distinction between protecting information technology and protecting operational technology (OT) became very clear in 2010, when the Iranian nuclear enrichment facility Natanz was attacked by Stuxnet malware.

OT includes programmable logic controllers, intelligent electronic devices, human-machine interfaces and remote terminal units that allow humans to operate and run an industrial facility using computer systems.

In a region that is experiencing an unprecedented increase in cyber security threats, the UAE is taking actions that are already paying off.

The increase in threats is described in the State of the market report 2021 and the State of the market report 2022 annual reports published by Help AG. These studies focus exclusively on digital security in the Middle East, highlighting the top threats and the sectors most impacted, and providing advice on where companies should invest their resources.

In September 2021, the Abu Dhabi Department of Health announced that it would create a drone delivery system to be used to deliver medical supplies (medicine, blood units, vaccines and samples) between laboratories, pharmacies and blood banks across the city.

The first version of the system will be based on a network of 40 different stations that drones fly in and out of. Over time, the number of stations is expected to grow.

Middle East-based IT leaders expect IT budgets for 2022 to be equal to, or above, pre-pandemic levels, with security spending expected to take the biggest share.

According to this years TechTarget/Computer Weekly annual IT Priorities survey, 63% of IT decision-makers in the Middle East region are planning to increase their IT budgets by 5% or more in 2022.

Accenture is to head up a consortium to develop and support a national payments infrastructure in the UAE that will enable next-generation payments.

Alongside suppliers G42 and SIA, the digital payments arm of Nexi Group, Accenture was selected by the Central Bank of the UAE to build and operate the UAE's National Instant Payment Platform over the next five years.

Saudi Arabia is investing $6.4bn in the digital technologies of the future and the tech startups that will harness them.

The announcement was made during a major new tech event, known as LEAP, in the Saudi capital Riyadh.

Potential cloud protests and maybe, finally, more JADC2 jointness … – Breaking Defense

Pentagon grapples with growth of artificial intelligence. (Graphic by Breaking Defense, original brain graphic via Getty)

WASHINGTON - After military information technology and cybersecurity officials ring in the new year, they'll be coming back to interesting challenges in an alphabet soup of issues: JWCC, JADC2 and CDAO, to name a few.

Of all the things that are likely to happen in the network and cyber defense space, those are three key things I'm keeping an especially close eye on in 2023. Here's why:

[This article is one of many in a series in which Breaking Defense reporters look back on the most significant (and entertaining) news stories of 2022 and look forward to what 2023 may hold.]

Potential JWCC Protests

On Dec. 7, the Pentagon awarded Amazon Web Services, Google, Microsoft and Oracle each a piece of the $9 billion Joint Warfighting Cloud Capability contract after sending the companies direct solicitations back in November.

Under the effort, the four vendors will compete to get task orders. Right now, it's unclear when exactly the first task order will be rolled out or how many task orders will be made.

It's also possible that, just like the Joint Enterprise Defense Infrastructure contract, JWCC could be mired in legal disputes, particularly when it comes to which vendor gets what task order.

"As you know, with any contract, a protest is possible," Lt. Gen. Robert Skinner, director of the Defense Information Systems Agency, told reporters Dec. 8 following the JWCC awards. "What we really focused on was: here are the requirements that the department needs. And based on those requirements, we did an evaluation, we did market research, we did evaluation to see which US-based [cloud service providers] were able to meet those requirements ... The decision based on whether there's a protest or not really didn't play into it, because we want to focus on the requirements and who could meet those requirements."

Sharon Woods, director of DISA's Hosting and Compute Center, said at the same briefing that under the acquisition rules for the task orders, "there's a $10 million threshold and a $25 million threshold on protests."

"So it's really dependent on how large the task order is," she added.

If there is a protest, the DoD could potentially see delays in a critical program it's been trying to get off the ground for years now.

A New Office To Oversee JADC2

After a year of a lot of back and forth about the Pentagon's Joint All Domain Command and Control effort to better connect sensors to shooters, a new office has been stood up with the aim of bringing jointness to the infamously nebulous initiative.

In October, DoD announced the creation of the Acquisition, Integration and Interoperability Office, housed within the Office of the Secretary of Defense. Dave Tremper, director of electronic warfare in the Office of the Undersecretary of Defense for Acquisition and Sustainment, will lead the office, and the first task will be finding how to truly get JADC2 across the department, Chris O'Donnell, deputy assistant secretary of defense for platform and weapon portfolio management in OUSD (A&S), said Oct. 27.

The creation of the office came a few months after Deputy Defense Secretary Kathleen Hicks said she wanted more high-level oversight of JADC2, and followed complaints from military service officials.

Tracking The CDAO

It'll be interesting to see what the new Chief Digital and Artificial Intelligence Officer Craig Martell and his office will accomplish over the next year. Martell, a former Lyft exec, was tapped as the Pentagon's first CDAO earlier in 2022.

As CDAO, Martell has some big responsibilities and can't pull on any prior Pentagon experience. When the CDAO office officially stood up June 1, it absorbed the Joint AI Center, Defense Digital Service and Office of Advancing Analytics, all key parts of the Pentagon's technology network. And there are plans to permit the chief data officer to report directly to the CDAO. (The CDO is operationally aligned to the office and has been rolled into one of its directorates, according to an internal DoD memorandum that was obtained by Breaking Defense in May.)

Already Martell's priorities have slightly shifted: he initially thought his job would entail producing tools for DoD to do modeling, but over the first few months on the job, there's been a focus on driving high quality data. During his remarks at the DIA DoDIIS Worldwide Conference Dec. 13, Martell said what most people think and demand of artificial intelligence is "magical pixie dust."

"What they're really saying is, excuse my language, 'Damn, I have a really hard problem and wouldn't it be awesome if a machine could solve it for me?'" he said. "But what we really can deliver in lieu of that, because I'm here to tell you that we can't deliver magical pixie dust, sorry, but what we can deliver is really high quality data."

Martell is also working to further other DoD efforts like zero trust, the Joint Warfighting Cloud Capability and JADC2. The Pentagon has set an ambitious goal of implementing zero trust across the department by 2027 and released a zero-trust strategy in November. The question remains as to what exactly a full implementation of zero trust will look like.

Double Down On Innovation With Edge Computing – Spiceworks News and Insights

Cloud services do not offer enough power for latency-intensive next-gen applications like AI, machine learning, robotics, smart cities, and automated manufacturing. These applications require faster computational speed than allowed in a centralized public cloud location. Edge offers a much faster computational speed with greater reliability and data processing in distributed locations closer to where data is generated, says Steve Grabow, SVP of Lumen Technologies.

The move to the cloud brought an interesting benefit for savvy business leaders: Freedom and focus, allowing them to get in an innovative state of mind. Rather than only operating an IT infrastructure, business innovators found the cloud freed them up to focus on developing innovative new applications.

The cloud also helped businesses double down on innovation. With the cloud, they could make critical applications available to personnel regardless of their location. It helped with cutting costs and consolidating valuable data.

But innovation won't stay in the cloud alone. The next direction for these innovators is the edge, recognizing it is the next frontier for innovation. They will move there for speed but garner a range of other benefits. While the cloud remains a great option for hosting many applications and workloads, it may not be the best place for many of today's latency-sensitive, interaction-intensive applications. Here's why.

Artificial intelligence, machine learning, robotic automation, and video analytics require near real-time data processing. We are talking about tremendous amounts of data that must be acquired, analyzed, and acted upon nearly instantly to produce the desired outcome.

This is where edge computing comes into play. Integrating edge computing with cloud computing offers faster data processing for these next-generation applications. Here is why it's all about the edge.

With edge computing, companies can shorten the physical distance that data needs to travel to reach a public cloud location or on-premises data center, resulting in reduced latency. A large, centralized data center could be hundreds of miles from where data is produced and collected. Edge computing allows businesses to place computing and storage where digital interactions occur. Edge devices can collect, process, and store data in a more distributed fashion than the cloud alone, leading to quicker response times and reduced latency.

There are other benefits to shortening the distance data needs to travel, among them improved security. Security remains a top priority for global IT decision-makers: 80% are concerned about data security. By physically isolating data, applications, and other resources at the edge, customers can achieve more privacy and security than in the cloud. This reduces risk and allows businesses to better support industry-specific regulatory or compliance requirements.

Many edge devices include built-in security capabilities that protect data that lives on or moves through devices outside of centralized data centers. As the data lives at the farthest reaches, or edge, of a company's network, it becomes easier to isolate and protect key data sources, allowing autonomous operations to avoid disruption.

The idea behind edge computing is beautifully simple: when you can't get your data closer to your data center, you move your data center closer to your data. Think about it: applications that power robotics don't have time to travel to the public cloud to get the needed performance. Applications that require high volumes of interactions, like streaming analytics, are subject to costly fees for moving that data in and out of the cloud.

Enter edge data centers, which can facilitate much faster response times. 77% of global IT decision-makers say that only edge computing can solve their latency challenges, and 56% say that their mission-critical applications will require five milliseconds of latency or less in the coming years. That's because data flow must be seamless and data processing lightning fast to enable things like cashier-less checkout in retail or robotic automation on the manufacturing floor. If a system or application takes too long to respond or to put an action in place, the result will be a poor customer experience.

Edge computing can deliver that ultra-low latency and improved reliability. Processing data locally at the edge also reduces the traffic flowing to and from central servers, which can cut those costly data transfer fees and improve overall network performance.

Cloud computing and edge computing are distinct but complementary. However, a central cloud often can't provide the performance, low latency, and scale businesses need for their next-generation applications. A more agile, distributed environment that extends the cloud is now necessary.

Edge computing brings a unique combination of local computing and storage with built-in security and unified orchestration. Businesses can still manage applications with similar ease as the cloud but achieve reduced latency and better performance.

This does not mean that the edge replaces the public cloud or data centers. Rather, the edge is an additional place to run workloads when it makes sense and when end users can benefit from more speed.

According to Gartner, enterprise-generated data processed outside a traditional data center or cloud will jump to 75% by 2025. Cloud services on their own will not be enough to manage this growing amount of data. Businesses are grappling with creating better ways of acquiring data, analyzing it for insights, and acting upon it to drive a specific outcome. With edge computing, businesses can bring millions of smaller cloud environments closer to billions of connected devices to do more with their data.

Driving unique digital experiences is how businesses compete and win. And at the heart of those digital experiences is data. Innovators will lead the way to the edge. Future-ready businesses will be prepared at the edge for the data-intensive applications of the 4th Industrial Revolution.

Simplifying digital sovereignty in a multi-cloud world – The Register

Sponsored Feature Sovereignty has traditionally been defined as the ability of a state to rule itself and its subjects, and it's been on the agenda since civilisation began. But only recently has digital sovereignty - the ability to control and make decisions about your own digital assets - emerged as an issue in its own right.

"Broadly speaking, digital sovereignty means having control of your digital destiny," explains Tim Phipps, director of cloud alliances at French technology group Thales. "One level below that, it means that you're in full control of the software and the hardware and the data that your business relies on."

Having control of your digital destiny might not have seemed important when all IT did was run a batched payroll. Today, when companies live and die based on their use of technology which spans multiple devices, systems, applications, workloads and hosting locations, it's a much bigger deal.

It's a particular problem for companies using cloud service providers that don't tend to keep their data in one place anymore. Over 90 percent of organisations now have a multi-cloud strategy according to a recent Thales Threat Report. It also found companies mixing these multi-cloud environments with on-premises and collocated data centre operations, muddying the waters further.

That presents a number of challenges, including, for example, the costs involved in managing multiple encryption key stores and management processes as more data moves into the cloud. That means organisations often have to employ different teams to run different key management solutions (a recruitment challenge in itself). They also run the risk of extending the attack surface for hackers by fielding a series of disjointed data security solutions, each of which operates differing security policies and processes.

Like the internet itself then, digital sovereignty sounds simple enough but is a deceptively complex concept with many moving parts. To help organisations fully understand and address these challenges, Thales breaks sovereignty down further into three elements: data, operations, and software.

Data sovereignty

Data is what many people think of first when they hear the term digital sovereignty. This is a state's ability to protect its own data, and that of its citizens, from intrusion by other states. This has been a defining issue at the heart of internet governance during the last two decades.

"The thing that nations were concerned about, especially in Europe, was that a lot of that cloud providers (or hyperscalers) are all American," Phipps points out. "There were all sorts of concerns around these US spy laws."

The Patriot Act included language that granted US authorities possible access to cloud service providers' business records, which Microsoft later admitted put the privacy of non-domestic customers' records in jeopardy if the US government asked for them. The CLOUD Act, introduced in 2018, later solidified government investigators' ability to obtain files from companies processing data on foreign soil.

At the same time, there was an ongoing tussle with privacy advocates over data sharing between US and European companies. The Safe Harbour provision allowed US companies to transfer data from EU partners to the US if they promised to abide by several privacy principles. Privacy advocate and lawyer Max Schrems challenged this provision in 2015, which led to it being struck down and eventually replaced with the EU-US Privacy Shield agreement. Schrems challenged that, too, and it was declared invalid in 2020.

Now, the EU and US are taking a third stab at it with the Trans-Atlantic Data Privacy Framework. A draft adequacy decision, which attempts to replace the Privacy Shield decision invalidated by Schrems II, has very recently been published by the EU.

Initial feedback from Schrems is that the draft decision is almost wholly based on the known Executive Order that was previously thrown out, and the expectation is that his team is likely to challenge it in the European courts. Consequently, companies on both sides of the Atlantic remain unclear about what happens next, and this can cause digital transformation projects to stall.

Phipps recommends that organisations put appropriate controls in place to protect their digital assets, so that they can own their own data sovereignty and speed up their journey to the cloud independently of geopolitical change.

However, this can present a considerable challenge in a multi-cloud and hybrid environment. Almost one in five respondents to the Thales survey said that they did not know where all their data is stored. Around half said that managing their sensitive data in multi-cloud environments is more difficult than looking after it on-prem. It's a situation that can have grave ramifications. The Thales survey also found 35 percent of respondents suffered data breaches or failed audits of cloud-based data and applications in the last year.

Operational and software sovereignty

The second class of digital sovereignty is operational, Phipps continues. "This is where you've got your data in the cloud and you're worried about an insider threat or bad actors," he says. "That could be a cloud engineer, but it could also be your own people." A rogue employee seeking personal financial gain could pilfer your data, as could a disgruntled worker, potentially at the behest of a third party.

The threat to operational sovereignty might also appear in the form of malware, a corrupted application, or ransomware that has harvested the login credentials of a privileged user to gain escalated access to sensitive data or systems.

Finally, Thales lists software sovereignty as an issue for companies. "This is the ability to run your workloads wherever you want," Phipps says.

Companies increasingly want choice when they go to the cloud. They might want to run most of their workloads with a specific cloud provider to get improved commercial terms, he explains. But regulators are concerned that, if there's no possibility to perform a controlled and prompt stressed exit, those companies are effectively putting all their eggs in one basket. Organisations are being encouraged to ensure that their mission critical workloads are secure and portable which helps ensure operational resilience and business continuity should something go wrong.

For example, the Bank of England's Prudential Regulation Authority, which succeeded the Financial Services Authority, has expressed concern over banks' reliance on cloud computing from a single vendor. It recommends that Financial Services Institutions adopt multi-cloud architectures to spread risk and avoid vendor lock in, whilst the Digital Operational Resilience Act (DORA) in the EU advocates a similar approach.

Consequently, banks are under pressure to share their workloads around and have backup cloud providers that they can switch to in the event of an outage or a breakdown in the relationship.

Protecting digital sovereignty in practice

Thales has made a business of supporting these various sovereignty requirements. It uses a four-step process to get its clients onto a positive footing where they feel completely in control of their own data, operations, and software.

The company begins with a process of discovery. You can't control what you don't know about, after all. So it seeks to answer the questions: Where is your data and what is it? How sensitive is it?

Many organisations don't know these basic facts, says Phipps. They're generating new data all the time, and it's no longer simply a case of classifying database records in known fields. "Increasingly, a lot of the sensitive data that's been generated is unstructured and appears in random locations," he posits. Increased cloud adoption and modern remote work policies have contributed to 2.5 quintillion bytes of new data being generated every day in emails, presentations and spreadsheets according to some estimates. Without outside help, it's much harder to track the location of the sensitive information those vast volumes contain.

To that end, Thales developed its Data Discovery and Classification (DDC) solution, which scans for specific data types according to compliance models in which the organisation is interested. DDC uses machine learning algorithms, and a reference library of pre-defined data privacy and regulatory templates such as GDPR, CCPA, LGPD, PCI DSS and HIPAA, to find sensitive data of interest and apply a risk score based on the client's policies and compliance. DDC can then recommend manual remediation or apply it automatically which saves time and helps minimise the attack surface.
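Thales' DDC is a commercial product, but the underlying discovery idea can be illustrated with a toy scan: walk a directory tree and flag files matching sensitive-data patterns. Everything below (patterns, paths, the reporting) is a simplified assumption for illustration, not how DDC itself works.

```python
# A toy illustration of sensitive-data discovery, not Thales' DDC.
import re
from pathlib import Path

PATTERNS = {
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan(root: str) -> list[tuple[str, str, int]]:
    """Return (file, pattern_name, hit_count) for every match found."""
    findings = []
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for name, pattern in PATTERNS.items():
            hits = len(pattern.findall(text))
            if hits:
                findings.append((str(path), name, hits))
    return findings

for file, kind, count in scan("/data/shared"):
    # A real tool would apply a risk score and a remediation policy here.
    print(f"{file}: {count} possible {kind} value(s)")
```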

Protection through multi-layered encryption

What does this remediation look like? This is where the second step of Thales' methodology - protection - comes in. This focuses mainly on multi-layered encryption, which it splits into three types: data at rest, in transit, and in use.

As Phipps points out, all major cloud providers encrypt data at rest by default. However, there's a caveat: many of them only encrypt it at the disk level. That may stop someone retrieving the data in the unlikely event that they steal a physical disk from the cloud data centre, but what if they hijack someone's account remotely? Few, if any, cloud providers automatically encrypt data above the disk level, such as at the file, database or application level, which can help mitigate this threat.

Thales provides solutions to encrypt data at multiple levels, including structured and unstructured data, to achieve defence in depth. Transparent encryption at the file level protects the entire database for around a two percent performance overhead, Phipps says. Clients can apply higher levels of encryption to specific fields in the database should they wish.

While that imposes an extra performance overhead, it also provides heightened levels of protection where needed. Security, compliance and performance are often subject to trade-offs; the key is to adopt a risk-based approach to ensure the protection applied is appropriate to the desired business outcomes.
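As a generic illustration of field-level protection (not Thales' product or API), the sketch below encrypts just the sensitive field of a record with the third-party cryptography package, so database files and backups never hold the plaintext while the rest of the record stays queryable.

```python
# Field-level encryption sketch using the "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, fetched from a key manager / HSM
f = Fernet(key)

record = {"user_id": 42, "email": "alice@example.com", "plan": "pro"}

# Encrypt only the sensitive field; the rest stays in the clear.
record["email"] = f.encrypt(record["email"].encode()).decode()
print(record)

# An authorised reader with access to the key can reverse it.
plaintext = f.decrypt(record["email"].encode()).decode()
assert plaintext == "alice@example.com"
```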

When it comes to in-transit encryption, TLS is the de facto standard. However, Phipps argues that this can often struggle under the huge volumes of data that some companies process in the cloud. Instead, Thales offers high-speed encryption in its Network Encryptor products, which Phipps says are faster than TLS and offer a higher degree of protection.

Thales also focuses on encryption of data in use, which fights tampering or snooping during cloud processing. "We've been speaking to some critical infrastructure providers like energy companies, and they're worried about running sensitive workloads in the cloud for safety critical applications," Phipps says. "If somebody injects some malware into a chip that is processing data, they could effectively perform a denial of access and take the whole system down."

To combat this, cloud providers are working on various confidential computing initiatives. In these services, a portion of the chip becomes a secure enclave under the customer's control rather than the cloud service provider's. Microsoft Azure, Google Cloud and AWS all have offerings here.

Maintaining data sovereignty in the cloud

The third strut of Thales' digital sovereignty service is control. Leaving those encryption keys in the cloud theoretically puts them under the cloud service provider's control, which is a clear threat to the customer's sovereignty. This renders the customer vulnerable to malicious behaviour or mistakes from a third party such as the cloud service provider's own support engineers. It also puts the data at risk in the event of a subpoena from a foreign state.

The solution to this threat lies in the separation of duties. Creating and storing encryption keys outside the cloud, separating them from where the sensitive data is stored, gives the customer ultimate control over the data. Should a threat to the data arise, the customer can withhold the keys. Because the cloud service provider cannot unilaterally access those keys, they cannot be compelled to hand over the data to a third party.
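Mechanically, this separation is usually achieved through envelope encryption: data is encrypted under a data-encryption key (DEK) inside the cloud, and the DEK is itself wrapped by a key-encryption key (KEK) that never leaves the customer's premises. Here is a minimal sketch that uses Fernet for both layers purely for brevity; in practice the KEK would sit in an on-premises HSM or external key manager.

    from cryptography.fernet import Fernet

    # KEK: generated and held outside the cloud, e.g. in an on-premises HSM.
    kek = Fernet(Fernet.generate_key())

    # DEK: a per-object key used inside the cloud.
    dek_bytes = Fernet.generate_key()
    dek = Fernet(dek_bytes)

    ciphertext = dek.encrypt(b"sensitive record")
    wrapped_dek = kek.encrypt(dek_bytes)  # only the wrapped DEK is stored with the data

    # Reading the data requires the external key manager to unwrap the DEK first.
    # Withhold the KEK, and the stored ciphertext plus wrapped DEK are useless.
    recovered = Fernet(kek.decrypt(wrapped_dek))
    assert recovered.decrypt(ciphertext) == b"sensitive record"

Because the cloud provider only ever holds wrapped keys, it has nothing useful to surrender, whatever legal instrument is brought to bear.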

Some cloud providers have introduced services that allow customers to store their own keys for cloud-based workloads, creating a clear segregation of duties between cloud service provider and customer. Google Cloud's External Key Manager (EKM) is a case in point.

Thales has been working with Google Cloud to help customers manage control of their data, operations, and software while still enjoying the benefits of cloud computing. In December 2020, they worked to integrate Thales' CipherTrust Key Broker service with Google Cloud's EKM. This enables customers to generate their encryption keys for Google's cloud service while keeping the keys outside the Google Cloud environment.

Since then, the two companies have expanded the partnership to cover other services. In June 2021, for example, CipherTrust Manager and Thales' SafeNet Trusted Access product were integrated to support client-side encryption for Google's Workspace service. This lets organisations encrypt Google Drive data using their own keys.

In the last year, they've also collaborated on a cloud-based platform that complies with the French government's Trusted Cloud label, which requires cloud service providers to host their servers in France and allow only European companies to operate them, while limiting data transfers to other countries. In addition, they're developing a Trusted Cloud-compliant service, scheduled for release in 2024. In the meantime, Thales' majority-owned joint venture with Google Cloud, S3NS, is already enabling Google Cloud customers in France to restrict access to EU locations, with the help of its own key management services.

Unifying cloud operations and key management while keeping keys under customer control solves one of the biggest problems in cloud security: the complexity of key management. The Thales survey found 57 percent of companies using at least five separate key management solutions, increasing the complexity and cost of managing data encryption. Aggregating and simplifying this key management will become steadily more critical as companies manage sensitive data in an increasingly distributed environment.

Monitoring digital sovereignty over time

As companies continue to expand their digital assets across complex multi-cloud and hybrid environments, they need a way to maintain visibility. This is where the final part of Thales' digital sovereignty process comes into play: monitoring. The company's CipherTrust platform, currently available as an on-premises product but soon to be launched as a service, provides a single-pane-of-glass view of digital sovereignty across an organisation's tools and processes in multi-cloud and hybrid cloud environments. The system provides access to a range of third-party products alongside Thales' DDC.

Digital sovereignty principles and practices will only become more complex over time, says Phipps. That's why he emphasises the benefits of building privacy by design into hybrid multi-cloud architectures. He also advocates building human relationships based on trust and empathy that seek to maximize the customer experience.

"Technology on its own without the right partnerships at an advisory and supplier level probably won't make it easier for the customer to understand how to move forward," he concludes. This is a difficult puzzle to unravel, and it takes third-party expertise, and a partnership approach, to do it properly.

Sponsored by Thales.


The Global IT Services Market size is expected to reach $2,013.6 billion by 2028, rising at a market growth of 8.4% CAGR during the forecast period -…

ReportLinker

In addition to assisting with other business operations, corporations employ information technology (IT) services to create, manage, and deliver information. Services include hardware deployment, training, consulting, software development, and systems integration.

New York, Dec. 21, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Global IT Services Market Size, Share & Industry Trends Analysis Report By Type, By Enterprise Size, By Industry, By Regional Outlook and Forecast, 2022 - 2028" - https://www.reportlinker.com/p06374139/?utm_source=GNW

Outsourcing, managed services, security services, data management, and cloud computing are some of the areas that make up the overall market for IT services.

Generally, a company's profitability in this industry depends on its capacity to develop technical know-how and improve its services. Both smaller and larger businesses can compete in this sector. While larger businesses tend to offer a wider range of services and have a global presence, smaller businesses typically target specialized markets and tailor their products more closely to the demands of their target audiences.

Cloud computing, which stimulates IT-related innovation, is the segment of IT services that is developing at the fastest rate. Some IT services, such as hardware installation, support, and maintenance, are frequently outsourced on demand since they often involve replacing or repairing out-of-date equipment.

A qualified provider will properly dispose of consumer hardware, formatting hard drives to remove all potentially sensitive data. IT departments typically install the computers, hard drives, printers, modems, and routers that the company needs to run smoothly. Wherever there is a large amount of hardware, there is also a constant need for repairs, and troubleshooting errors is part of that repair work.

COVID-19 Impact Analysis

In recent years, the interconnection of the global economy has increased substantially. Indicators of the negative effects of various containment-related actions include disruptions in the global supply chain, a decline in demand for imported products and services, and an increase in the unemployment rate. The financial market has grown more volatile as a result of historically low interest rates, huge decreases in equity and commodity prices, and heightened risk aversion. Most organizations have stabilized and enhanced their infrastructure to improve operations during the pandemic.

Market Growth Factors

Demand For Cloud-Based IT Services Is Growing

The majority of businesses and industries have replaced on-premises software with cloud-based software, which enables access to all enterprise applications at a reasonable cost and without a significant initial investment in software or hardware. The adoption of cloud computing also makes it easier to expand and contract commercial operations. Consequently, cloud-based IT services have become a more advantageous and cost-effective option for SMBs in recent years, and cloud computing opens up new business possibilities and opportunities for them.

Increased Investment Return With Reduced Infrastructure And Storage Costs

The initial implementation and ongoing costs of hosting data on-premises are a problem for businesses, as are labor expenses and downtime. Competitive pressure and global economic conditions have hastened the adoption of cost-effective business model restructuring strategies. Growing enterprise embrace of digital transformation and accelerating customer experience initiatives are further drivers of cloud computing services, which ultimately reduce enterprise costs.

Market Restraining Factors

Insufficient Standardization

The effectiveness of IT services varies significantly between firms and depends on a variety of criteria. Each organization is unique and, as a result, utilizes specialized technologies to satisfy its distinct business needs. Due to this lack of standardization, it is difficult for businesses to assess the viability of IT services based on the success rate of the same technology in another organization. A typical IT service setup may cost $75 to $300 per user, so poorly fitting IT services can place a significant cost burden on enterprises.

Type Outlook

Based on the Type, the IT Services Market is segmented into Security Outsourcing, IT Support, Managed Security Services, Systems and Network Implementations, and Security Strategy and Planning. The security outsourcing segment witnessed a significant revenue share in the IT services market in 2021. The prevalence of cyberattacks, which is constantly rising, is a major worry for business owners everywhere. In-house IT security is more expensive than security outsourcing. As a result, small and medium-sized organizations typically choose to contract out their security needs.

Industry Outlook

On the basis of Industry, the IT Services market is segmented into BFSI, Telecommunication, Healthcare, Retail, Manufacturing, Government, and Others. The Telecommunication segment recorded a substantial revenue share in the IT services market in 2021. Communication service providers have been confronted with issues around optimizing their existing business and hunting for new niches in which to offer innovative services, and the market is shifting fundamentally owing to changes in the nature of new services and the methods of delivering them.

Organization Size Outlook

By organization size, the IT Services Market is divided into Large Enterprises and Small & Medium Enterprises. The small & medium enterprises segment registered a significant revenue share in the IT services market in 2021. IT adoption improves small businesses' competitiveness, operational efficiency, and growth, and business executives internationally are placing a larger focus on adopting information technology to develop dynamic capabilities.

Regional Outlook

Region-wise, the IT Services Market is analyzed across North America, Europe, Asia Pacific, and LAMEA. The Asia Pacific region recorded a promising growth rate in the IT services market in 2021. The increase in IT service spending by businesses in the region coincides with COVID-19 accelerating digital transformation and the shift to the cloud, and with record-high demand for IT services globally. The majority of businesses in the Asia Pacific region have expanded their investments in cutting-edge technology to quickly implement and respond to market changes.

Partnerships are the major strategy followed by the market participants. Based on the analysis presented in the Cardinal matrix, Accenture PLC and IBM Corporation are the forerunners in the IT Services Market. Companies such as Tata Consultancy Services Ltd., Infosys Limited and Capgemini SE are some of the key innovators in the IT Services Market.

The market research report covers the analysis of key stakeholders of the market. Key companies profiled in the report include Microsoft Corporation, Accenture PLC, Tata Consultancy Services Ltd., Cognizant Technology Solutions Corporation, Wipro Limited, HCL Technologies Ltd. (HCL Enterprises), Capgemini SE, IBM Corporation, Infosys Limited, DXC Technology Company, and NTT Data Corporation.

Recent strategies deployed in IT Services Market

Partnerships, Collaborations & Agreements

Oct-2022: DXC Technology formed a partnership with Dynatrace, which provides a software intelligence platform based on artificial intelligence and automation. Under this partnership, the Dynatrace Software Intelligence Platform would become the selected DXC Platform X software for monitoring and AI-powered automated management of a customer's IT estate. Moreover, the addition of Dynatrace, with its unified, intelligent view across software products and technologies, helps reinforce predictive AIOps abilities and drive cost optimization.

Oct-2022: Capgemini joined hands with Microsoft Corporation, an American multinational technology corporation. Together, the companies aimed to provide a first-of-its-kind, serverless, cloud-native, Azure-based digital twin platform, known as ReflectIoD. ReflectIoD is a secure, highly scalable platform that would utilize best-in-class architecture and technology from the Azure portfolio to help transform an organization's operations and maintenance effectiveness, enabling intelligent industry and driving sustainable business value.

Oct-2022: Accenture partnered with Google Cloud, an American multinational technology company. This partnership aimed to increase their respective talent, increase their joint abilities, create new solutions utilizing data and AI, and deliver improved support to help customers build a strong digital core and reimagine their companies on the cloud. Moreover, Cloud presents boundless opportunities for companies to be more creative and resilient.

May-2022: NTT DATA joined hands with NTT, a wholly owned subsidiary of Nippon Telegraph and Telephone Corporation. Through this collaboration, the companies aimed to integrate their system connectivity capabilities with NTT Ltd.'s Edge to Cloud service operation abilities. The integration would allow the business to combine IT services and connectivity and react to increasingly complicated and varied client needs on a global level by centrally creating a service offering necessary for digital transformation.

May-2022: IBM joined hands with Amazon Web Services, which provides on-demand cloud computing platforms. Through this collaboration, the companies aimed to provide IBM Software-as-a-Service on AWS. The presence of the IBM SaaS portfolio on AWS would permit businesses to concentrate on providing clients value without worrying about IT infrastructure management, allowing innovation at a faster clip.

Mar-2022: Cognizant joined hands with Microsoft, a technology corporation creating computer software. Through this collaboration, Cognizant aimed to improve its Multiphase Solutions Rollout, supporting its expanding healthcare practice and its ability to modernize payers and suppliers with digital capabilities.

Mar-2022: HCL Technologies signed an agreement with NEORIS, a global digital accelerator that co-creates disruptive solutions. The agreement would bring specialized capabilities to customers in global markets, including the ability to improve application utilization time, business management operations, and combined IT services. Additionally, the companies aimed to speed up digital transformation, reduce risks, assign teams based on product development, develop a zero-incident culture and cut expenses.

Feb-2022: HCL Technologies partnered with VMware, an American cloud computing and virtualization technology company. The partnership would expand HCL's Cloud Smart portfolio of services powered by VMware technology to include support for VMware Telco Cloud 5G Core and VMware Telco Cloud RAN. Moreover, the collaboration would provide combined solutions for service providers across the world.

Dec-2021: NTT DATA teamed up with AWS, a subsidiary of Amazon. Together, the companies aimed to utilize their proven track record and strong delivery capabilities in Japan to establish a leading position in digital business, advance their enterprises on a global scale, and contribute to business development for their clients. Moreover, NTT DATA would reinforce a framework to support customers considering shifting their IT infrastructure to the cloud.

Nov-2021: Wipro Limited entered a partnership with TEOCO, a privately owned telecom software vendor. Through this partnership, the companies aimed to create solutions that help communication service providers (CSPs) enhance network automation, efficiency, flexibility, and dependability. TEOCO would help CSPs create a cooperative process to ensure service quality, network performance, and defect management, eventually allowing the rapid adoption of next-generation services.

Dec-2021: Wipro partnered with HERE Technologies, the location data and technology platform. This partnership would deliver location-based services to customers in the Telecom, Energy & Utilities, Transport & Logistics, Manufacturing, and Automotive industry verticals.

Nov-2020: Tata Consultancy Services entered a partnership with Zoho, a developer of web-based business tools. This partnership aimed to deliver premium IT Service Management, Customer Relationship Management, and e-Commerce solutions to solve issues for large companies. Additionally, the partnership would deliver end-to-end business solutions to global businesses and mid-market enterprises.

Jun-2020: NTT DATA joined hands with Microsoft, an American multinational technology corporation. Under this collaboration, the companies aimed to combine NTT DATA's best-in-class global IT services with Microsoft's cloud platform, AI technologies, and productivity tools, supporting businesses' digital transformation and growing the efficiency of business productivity and processes.

Mergers & Acquisitions

Oct-2022: Accenture completed the acquisition of Stellantis' World Class Manufacturing Training & Consulting business. The acquisition would reinforce Accenture's abilities in business process optimization and permits Accenture to incorporate the World Class Manufacturing (WCM) process into its solutions, allowing clients to transform their manufacturing processes and supply chain networks to be more sustainable, efficient, and robust.

Sep-2022: Accenture took over The Beacon Group, a growth strategy consulting firm serving Fortune 500 corporations. The acquisition would enhance Accenture's capabilities that permit C-suite leaders to make fact-based decisions on segmentation, targeting, and paths to growth, powered by market insights and flexible solutions to manage enterprise transformations at scale.

Sep-2022: IBM completed the acquisition of Dialexa, a foremost digital product engineering consulting service. The acquisition would improve IBM's hybrid cloud and AI capabilities and boost growth for customers. It is also expected to enhance IBM's product engineering expertise and provide end-to-end digitalization services for consumers.

Apr-2022: NTT DATA completed the acquisition of Business Services and Technologies OOD, one of the foremost SAP service providers. The acquisition aimed to improve flexibility and scalability in international shoring systems for consulting and managed services, combining multinational capacity with local proximity. Through this acquisition, NTT is expanding its shoring portfolio in the European Union.

Feb-2022: IBM completed the acquisition of Neudesic, a foremost U.S. cloud services consultancy. The acquisition would significantly expand IBM's offering of hybrid multi-cloud services and further enhance the company's hybrid cloud and AI strategy. Moreover, Neudesic brings deep Azure data engineering, cloud, and data analytics expertise to boost customers' hybrid cloud journeys.

Mar-2021: Accenture acquired REPL Group, a U.K.-based technology consultancy. The acquisition aimed to expand Accenture's capabilities to help customers across retail and adjacent industries transform their supply chains and processes and provide seamless consumer and employee experiences. REPL brings deep retail expertise, along with cutting-edge technology skills, to support global businesses and provide sustainable value.

Mar-2020: Infosys completed the acquisition of Simplus, one of the fastest-growing Salesforce Platinum Partners. Through this acquisition, Infosys further advances its standing as an end-to-end Salesforce enterprise cloud services and solutions provider, delivering clients unparalleled capabilities for cloud-first digital transformation.

Product Launches & Product Expansion

Mar-2022: HCL Technologies introduced Quality of Experience (QoE) and Energy Savings applications, two new 5G applications to help mobile network operators optimize the customer experience. The applications improve network performance in places with high traffic congestion, such as city centers and large sporting events. Utilizing artificial intelligence (AI), HCL's QoE application enables mobile network operators to deliver seamless, fast, and dependable 5G services, while the Energy Savings application uses AI-based network automation to cut the operating costs of providing 5G.

Dec-2021: Wipro introduced VisionEDGE, a digital signage and omnichannel advertising solution. Wipro VisionEDGE delivers a centralized platform for innovation and enables brands to control and stream content to heighten consumer attention. Moreover, VisionEDGE would utilize Wipro FullStride Cloud Services to give customers the ability to unlock new enterprise value from their brand assets and develop new revenue streams.

Apr-2021: IBM introduced a revamped ESS 5000 model in its Elastic Storage System (ESS) family of high-performance, highly flexible solutions designed for easy deployment. The ESS 5000 provides 10% greater storage capacity, while the new ESS 3200 delivers double the read performance of its predecessor.

Scope of the Study

Market Segments covered in the Report:

By Type

Managed Security Services

Security Outsourcing

IT Support

Systems & Network Implementations

Security Strategy & Planning

By Enterprise Size

Large Enterprises

Small & Medium Enterprises

By Industry

BFSI

Telecommunication

Healthcare

Retail

Manufacturing

Government

Others

By Geography

North America

o US

o Canada

o Mexico

o Rest of North America

Europe

o Germany

o UK

o France

o Russia

o Spain

o Italy

o Rest of Europe

Asia Pacific

o China

o Japan

o India

o South Korea

o Singapore

o Malaysia

o Rest of Asia Pacific

LAMEA

o Brazil

o Argentina

o UAE

o Saudi Arabia

o South Africa


St. Cloud hockey games scheduled in honor of player killed in crash – SC Times

ST. CLOUD - St. Cloud Crush Hockey will be hosting two hockey games in remembrance of Charlie Boike, a 17-year-old Technical High School student who died in a car accident Dec. 10 on his way home from his high school hockey game.

The games will take place on Dec. 30 and "will transform the annual battle between cross-town rivals St. Cloud Crush and St. Cloud Cathedral into a celebration of Charlie's impact on both teams, their families, and the community," said the St. Cloud Crush Boys Hockey Booster Club in a press release Thursday.

A junior varsity game will start at 3 p.m. and the varsity game will start at 7 p.m. at the St. Cloud Municipal Athletic Complex.

"There is no better way to celebrate Charlie's approach to life than his friends and teammates competing in the arena that houses so many fond memories," said St. Cloud Youth Hockey Association President Jared Smith.


Community members are asked to wear white clothing to the game in celebration of Boike's life and impact. The first 1,000 attendees will receive a commemorative t-shirt at the door.

You can buy tickets to the games at the door or online at http://www.stcloudmac.com.

A GoFundMe started to support the Boike family surpassed $57,500 as of Thursday morning, exceeding the family's initial $10,000 goal. According to a post on the site, the family is planning to create a Charlie Boike Memorial Fund that will help others play hockey, whether it be with equipment, registration costs, team fees or more.

"They hope to help as many kids experience the sport that meant so much to Charlie and so much to the entire Boike family," wrote organizer Amber Hedin, speaking on behalf of the family. "They want to have Charlie live on (through) so many as they take the ice."

Becca Most is a cities reporter with the St. Cloud Times. Reach her at bmost@stcloudtimes.com. Follow her on Twitter at @becca_most.



2 Metaverse Stocks That Could Make You Richer in 2023 – The Motley Fool

It wasn't all that long ago that investors were excited about the potential for the metaverse. Some analysts were tossing around projections that the market for the virtual world could eventually be in the trillions of dollars. The stocks of companies with major metaverse initiatives were on fire.

That was then. Today, there isn't nearly as much talk about the metaverse. Most of those once-sizzling stocks have flamed out.

But don't think for a second that the metaverse opportunity isn't significant. There's still a lot of money to be made -- by companies and investors. Here are two metaverse stocks that could make you richer in 2023.

You might not think of Amazon (AMZN) as a metaverse stock. After all, the company's head of devices, Dave Limp, said earlier this year that Amazon is focused on "the real world" and not the metaverse.

However, there's more to the story. In February, Amazon posted a job opening for a product manager on its Amazon Web Services (AWS) team who "will own the delivery of cloud-based metaverse services." This job posting shows that the company actually is more focused on the metaverse than you might think.

This shouldn't be surprising. AWS reigns as the leading provider of cloud hosting services. The metaverse will present a massive growth opportunity for the cloud hosting market.

Amazon would be crazy to ignore the metaverse. And it isn't crazy.

I predict that Amazon will ultimately be among the big winners if and when the vision of the metaverse becomes a reality. Before that happens, though, the stock could (and will, in my view) make investors money in 2023 for a completely different reason.

Amazon stock is historically cheap after plunging close to 50% this year. Much of this decline is related to macroeconomic factors, rather than anything specific to the company. I'm not in the camp that believes a new bull market is imminent. But I do think it's quite possible that we could see a big stock market rebound in the second half of next year.

Meta Platforms (META) provided the main catalyst for investors' initial enthusiasm about the metaverse opportunity. The company even changed its name from Facebook to reflect its big focus on the metaverse.

That metaverse pivot appears to have backfired badly. Meta stock has plunged close to 65% this year. CEO Mark Zuckerberg has been raked over the coals by some investors for spending too much money on what they view as a quixotic dream.

It's true that Meta's Reality Labs, the home to its metaverse initiatives, is burning through cash. However, Zuckerberg made a pretty good case for this spending in an interview earlier this month.

He noted that 90% of Reality Labs' research and development investments are going toward virtual-reality headsets and augmented-reality glasses. Those efforts could pay off handsomely, even if Zuckerberg's metaverse vision isn't achieved.

The metaverse won't be a moneymaker for Meta anytime soon. But the company's social networking apps continue to generate enormous revenue. In the third quarter alone, Meta raked in $27.7 billion in sales with profits of nearly $4.4 billion. Although both numbers reflected year-over-year declines, that's still a lot of money.

Like Amazon, Meta is feeling the impact of macroeconomic headwinds. Advertising spending has fallen. However, if a recession is avoided (as Goldman Sachs expects), the ad market could regain momentum next year. Even if a recession comes, many economists predict that it will be short and mild.

Importantly, Meta's user base continues to grow. In Q3, the numbers of daily and monthly active users across its family of social apps both increased by 4% year over year to 2.93 billion and 3.71 billion, respectively.

The company is also rolling out new products that should retain existing users and attract new ones. It's retooled its feeds to focus more on content curated by artificial intelligence (AI). And Meta is working hard to increase monetization, as well, especially with its WhatsApp messaging app.

Meanwhile, Meta stock is attractively valued. Shares trade at only 14.5 times expected earnings. Aswath Damodaran, the NYU finance professor who literally wrote the book on valuing companies (actually, he's written several of them), believes that the stock has practically all upside potential based on its current valuation.

With all of this in mind, Meta looks like a stock that could very well make investors richer in 2023. If its monetization and metaverse efforts pay off, that increased wealth could be multiplied over the long term.

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Keith Speights has positions in Amazon.com and Meta Platforms. The Motley Fool has positions in and recommends Amazon.com, Goldman Sachs Group, and Meta Platforms. The Motley Fool has a disclosure policy.

More:

2 Metaverse Stocks That Could Make You Richer in 2023 - The Motley Fool