Categories
Cloud Hosting

Questioning the Cloud Value Proposition | No Jitter

According to Wall Street and the quarterly reports filed by many top technology companies, including Amazon, Google, and Microsoft, cloud growth decelerated in the quarter that just ended. The growth of cloud computing has slowed a bit this year, The Wall Street Journal reported, and most tech publications sing the same tune. Some people are now questioning whether the cloud really is cheaper than the data center, as was claimed. It may not be time to abandon the cloud, but it sure sounds like we need to look at it more closely.

Let's start with a basic truth. If you're an enterprise with a data center that contains hundreds of rack-mounted servers, and you're comparing the price of running an application instance in a virtual machine in your data center versus in the cloud, the data center will be cheaper. As an enterprise, you can achieve almost the same economies of scale as a cloud provider, and you don't have to pay the cloud provider's profit margins.

So, the cloud is a massive fraud? Not so fast. If you have an application, not a single instance, that has to support consumer access via the Internet, and you want to host the application somewhere, that same cloud is almost certainly going to be cheaper. Why? Because the application will need to scale to the maximum number of users expected, and providing that scalability in your own data center means buying a bunch of servers that won't be used most of the time. The cloud's scalability is based on resources shared across all its users, so it's more efficient.
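The peak-versus-average argument above can be sketched in a few lines. This is a rough illustration only; all prices and demand figures here are hypothetical, not drawn from any real provider's rates.

```python
# A rough sketch of the peak-versus-average economics described above.
# All prices and demand figures are hypothetical illustrations.

def on_prem_cost(peak_servers, cost_per_server_month):
    # On-premises capacity must be sized for the peak, even when idle.
    return peak_servers * cost_per_server_month

def cloud_cost(hourly_server_demand, price_per_server_hour):
    # Cloud capacity is billed only for the hours actually used.
    return sum(hourly_server_demand) * price_per_server_hour

# A consumer-facing app over one month (~730 hours):
# 100 servers at peak, but only 20 most of the time.
demand = [20] * 700 + [100] * 30

print(on_prem_cost(100, 250))    # 25000 -- pay for peak capacity all month
print(cloud_cost(demand, 0.40))  # 6800.0 -- pay only for hours used
```

The wider the gap between peak and average demand, the stronger the cloud's case; for a flat, always-busy workload the two figures converge and the data center wins, which is exactly the distinction the article draws.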

There is a legitimate reason to use cloud resources. The current trends toward direct online product information and sales support encourage the building of applications that have to scale significantly and so are reasonable candidates for public cloud hosting. What's not legitimate is the often-expressed view that everything is moving to the cloud. What we are seeing now is proof that's not the case.

Suppose the cloud really were cheaper, and that everything in the data center could run in the cloud at a lower cost. Here we are in an economic downturn, and yet companies are decelerating their cloud adoption. Forget savings, they say, and toss out that cheaper cloud option! Nonsense. Clearly, a cheaper cloud would be adopted more, not less, often. But suppose that the cloud's growth depends not on overall cheapness, but on enterprises shifting to an online marketing/sales model. If we're in that same economic downturn, wouldn't it be smart to wait until things look better before we start our new online program? The result is a slowing of cloud growth.

Or suppose you're a social media company whose revenues come from ads. Do your advertisers, seeing the threat of an economic downturn, decide to up their spending, so you can up your cloud spending? Doubtful, so what really happens is that they cut their spending, and you cut yours. Cloud providers' revenues don't grow as fast as a result.

The signals from the market are clear: cloud spending depends on consumers' online activity. It's the variation in these consumer workloads that makes an application a candidate for the cloud. If lower overall cloud costs were driving adoption, the same companies would be accelerating their movement to the cloud. In fact, we'd have converted to the cloud long ago.

OK, so everything isn't moving to the cloud. What's the action item? For IT and network planners, including CIOs, the truth about the cloud raises two questions. First, what actually happens to applications that, instead of being run either in the cloud or data center, must now be run with a foot in both worlds? Second, if the role of the WAN is to connect the cloud piece of applications to the data center piece, what does the network of the future actually look like?

The questions here are as important as their answers. What they're showing is that the network of the future is being framed by application software decisions rather than by things like the location of workers or facilities. As the cloud becomes the universal on-ramp for applications, how the cloud relates outward to workers and inward to the data center establishes network policy.

What seems inevitable is that nearly all application access will move to the cloud, that mission-critical transaction processing and databases will remain in the data center, and that the Internet on the outside and cloud-to-data-center connections on the inside will be the WAN of the future. In fact, the current cloud evolution could have a major impact on MPLS VPN services by siphoning traffic off VPNs and onto the Internet.

Is the basic value of the cloud under threat? No, the coverage of the cloud is what's questionable. We've accepted facile views of cloud benefits that never had any basis, and we're now struggling to understand the real benefits of the cloud. Cloud benefits are more complicated, so they take more than a few soundbites to communicate. The moral here is that the cloud, as it gets more pervasive, mature, and advanced, also gets more complicated, and we have to accept it will take some effort to deal with that. Once we do, then the real future of the cloud will become clear.

Go here to see the original:

Questioning the Cloud Value Proposition | No Jitter


5 things you need to know about re:Invent, AWS’s biggest cloud event of the year – About Amazon

Thousands of cloud-computing fans will convene in Las Vegas from November 28 through December 2 for Amazon Web Services' (AWS) 11th re:Invent cloud conference. Each year, re:Invent features leadership keynotes, new service announcements, fun, and inspiration. Here are five things to know about this year's re:Invent:

re:Invent is AWS's biggest cloud event of the year, and an opportunity to check out all the latest news and developments in compute, databases, analytics, machine learning, and storage. You can follow all the key announcements, and get a peek at the newest cloud technologies from AWS, at the Amazon Press Center.

After going virtual in 2020 and offering a hybrid event in 2021, re:Invent is offering more in-person events so all attendees, from customers to partners to aspiring technologists, can learn from experts and each other. The team has designed an in-person program with nearly 2,300 sessions. To ensure the conference is as inclusive as possible, AWS is live-streaming all keynotes and leadership sessions for virtual attendees. All presentations, including breakout sessions, will be captioned and broadcast. To catch the re:Invent action remotely, register to attend virtually.

AWS leaders, including CEO Adam Selipsky, will host keynotes, announcing the latest product launches and sharing inspiring customer stories. Other keynote speakers include Amazon Chief Technology Officer Dr. Werner Vogels, Senior Vice President of AWS Utility Computing Peter DeSantis, Vice President of AWS Worldwide Channels and Alliances Ruba Borno, and Vice President of AWS Database, Analytics, and Machine Learning Swami Sivasubramanian. In total, re:Invent will feature 22 leadership sessions. Check out the full agenda.

At its heart, re:Invent is a learning conference, offering builder labs, bootcamps, gamified learning, and hundreds of technical sessions, from the introductory to the most advanced. Attendees get to dive deep with new technologies, and they can practice new ways of working and hone their skills alongside their cloud-community peers.

Participants will have opportunities to explore demos and interact with technology, including meeting a robot bar-keep, seeing a basketball free-throw analyzer, and playing a cloud-skills game. AWS is also hosting a showcase for sustainability, which highlights how technology is being used to address challenges like water conservation and decarbonizing operations. And the AWS Disaster Response rolling lab, a technology-packed truck, shows the benefits of cloud capabilities during disaster responses. Finally, attendees can catch the annual AWS DeepRacer League Championship, where machine learning meets model cars racing autonomously around a tough track.

More:

5 things you need to know about re:Invent, AWS's biggest cloud event of the year - About Amazon


CloudWave acquires Sensato to expand its healthcare cybersecurity portfolio – Help Net Security

CloudWave acquires Sensato Cybersecurity, bringing together cloud hosting services and managed Cybersecurity-as-a-Service for healthcare organizations.

Sensato was founded by long-time health information technology visionary John Gomez, who will join CloudWave as chief security and engineering officer.

Sensato developed a fully integrated Cybersecurity-as-a-Service (CaaS) platform that features an innovative solution stack providing real-time network monitoring, intrusion detection, and asset fingerprinting, along with a 24/7 Security Operations Center designed specifically for healthcare infrastructure and connected devices. It will be available immediately as part of CloudWave's new Sensato Cybersecurity suite.

As healthcare organizations are increasingly targeted by cybercriminals, CloudWave's Sensato Cybersecurity suite provides a level of security that combines the ability to comply with best practices and regulations, detect threats, and respond to cybersecurity incidents in a fully integrated, easy-to-deploy, holistic platform.

The Sensato Cybersecurity suite is a natural fit with CloudWave's OpSus Cloud Services. It will enable hospitals to implement a fully managed cybersecurity program, resulting in full HIPAA and NIST compliance, with end-to-end service and support from a single provider.

The company's Cybersecurity Tactical Operations Center (CTOC), a next-generation SOC, employs a tactical approach to cybersecurity with continuous monitoring by cybersecurity analysts, while also incorporating machine learning.

The blending of operations for cybersecurity and cloud service delivery into a single extended healthcare ecosystem, spanning public cloud, private cloud, and on-premises healthcare technology environments, is unprecedented and provides a seamless, enhanced experience to customers.

"As a managed cloud service provider to hundreds of hospitals, CloudWave has witnessed how devastating a cyber event can be to healthcare organizations. Unfortunately, the frequency of these attacks continues to rise; in just the last two years, an increasing number of customers have called upon CloudWave for rapid response services to help halt and remediate cyberattacks on their on-premises systems," said Erik Littlejohn, president and CEO of CloudWave.

"With the addition of the innovative, proprietary technologies included in the Sensato Cybersecurity suite, along with the cyber expertise of the Sensato team, CloudWave will be able to offer customers the high-level cybersecurity we provide for our cloud-based delivery to on-premises systems," Littlejohn continued.

Rich Temple, vice president and CIO at Deborah Heart and Lung Center in Browns Mills, NJ, commented, "The Sensato Cybersecurity suite offers a level of healthcare data protection to hospitals that would be difficult to implement on our own. There's peace of mind in knowing that our systems are being monitored and protected by an experienced team of cybersecurity specialists."

He continued, "As a long-time CloudWave customer, we know that having a single partner for services delivery of our cloud-hosted applications and our on-premises cybersecurity systems and support simplifies our IT operations significantly."

CloudWave is an independent cloud and managed EHR hosting provider in healthcare and has been delivering secure IT services to the healthcare market via the cloud for more than a decade. The company is 100% focused on healthcare, with more than 250 hospital environments currently managed in the public cloud and the OpSus private cloud.

CloudWave also employs a defense-in-depth approach to cybersecurity. Its Sensato Cybersecurity suite and CTOC further fortify the OpSus Cloud Services platform and provide complete, managed cybersecurity as a service to customers.

"We're excited to become part of the CloudWave family," said John Gomez. "The promise of what we'll be able to collaboratively provide to the healthcare community with our combined expertise securing and delivering IT services is beyond measure."

The terms of the deal were not disclosed.

Follow this link:

CloudWave acquires Sensato to expand its healthcare cybersecurity portfolio - Help Net Security


GoZone WiFi and Forum Info-Tech Provide Toyota Arena and Ontario Convention Center with Innovative Portfolio of IT Solutions to Self-Manage, Automate…

Stadiums and Convention Center Venues Benefit from New Ways to Monetize and Differentiate their Facilities

ST. PETERSBURG, Fla. & CORONA, Calif., November 29, 2022--(BUSINESS WIRE)--GoZone WiFi, the leader in WiFi monetization and management tools, and Forum Info-Tech, a top 100 managed service provider (MSP), today announced that the two companies are working together to provide the Toyota Arena and Ontario Convention Center with advanced IT and WiFi solutions. Toyota Arena is a multi-purpose arena in Ontario, CA, hosting local sporting events and concerts including the Ontario Clippers, and the Ontario Convention Center is a full-service, state-of-the-art convention facility used for conventions, trade shows, exhibits and meetings.

GoZone WiFi and Forum Info-Tech are helping to drive revenue and connect with Wi-Fi users who exhibit at or attend events at large facilities like the Toyota Arena and Ontario Convention Center, using GoZone's Smart WiFi Suite of guest analytics solutions and Forum Info-Tech's IT products and services. These combined solutions offer these venues updated IT services and the ability to self-manage and automate the onboarding of any type of user, including exhibitors, guests, fans, contractors, and all their IoT devices.

"We are excited to partner with Forum Info-Tech to provide Toyota Arena with our combined innovative guest analytic and IT solutions," said Todd Myers, Founder and CEO, GoZone WiFi. "GoZone's solutions for stadium and convention center facilities like Toyota Arena and Ontario Convention Center provide these venues with the opportunity to increase concession revenues, drive future attendance, and make informed staffing decisions. In addition, these venues can better understand traffic flow and attendee behavior and apply that data to event layout and design. They also can showcase event sponsors with interactive ads, drive app downloads, and provide advertiser attribution."


"GoZone Wi-Fi helps Forum Info-Tech add an additional layer of service for our Managed IT Service venues like the Toyota Arena and Ontario Convention Center. In addition to providing onsite and co-managed IT services for their computer networks, workstation support, and cloud hosting and security compliance, we have found that GoZone Wi-Fi provides Forum Info-Tech the logical extension of taking care of the venue's Wi-Fi to provide an incredible and hassle-free WiFi experience for both the venue and those attending events," added Biren Shukla, President and CEO, Forum Info-Tech.

About GoZone:

GoZone WiFi is a SaaS company and leader in monetizing and managing guest WiFi. The company offers business analytics, venue intelligence, and guest engagement by using WiFi networks to deliver branded content, provide customer analytics, and display advertising. GoZone's Smart WiFi Suite of products enables WiFi monetization through rich location data, marketing engagements, and third-party sponsorships. GoZone's venue intelligence enables enterprises to strategically refine operations, bridging the gap between marketing and IT. Learn more at GoZoneWiFi.com.

About Forum Info-Tech:

Forum Info-Tech specializes in educating our clients on the information technology options available to ease business IT concerns and implement the best solution. Our professional scope ranges from engineering and implementing on-premises network solutions to designing and migrating to cloud solutions, business continuity and data recovery solutions, and consulting on various IT projects. Our network and technical engineers' combined experience allows us to successfully provide custom, affordable solutions to our valued clients. For more information, please visit http://www.foruminfotech.net.

View source version on businesswire.com: https://www.businesswire.com/news/home/20221129005396/en/

Contacts

Media Inquiries: christine@cbpartnerspr.com + 1 714 206 9800

Read the original:

GoZone WiFi and Forum Info-Tech Provide Toyota Arena and Ontario Convention Center with Innovative Portfolio of IT Solutions to Self-Manage, Automate...


How Cloud Computing Will Drive The Future Of Data Analytics – HostReview.com

Cloud computing and data analytics are the present-day superheroes that can run a business solely on their shoulders. Since the introduction of cloud services in 2006, the field has evolved enormously, and almost all tech companies, big and small, including Amazon, Microsoft, IBM, Oracle, and Adobe, now offer cloud services at various levels.

Data analytics, on the other hand, is the study of statistics and is literally as old as the pyramids: ancient Egyptians used census statistics in building the pyramids. Statistics played an important role for governments all over the world in the creation and classification of censuses, the distribution of goods, and the collection of taxes. Data analysis is the process of collecting data from various sources and studying it to extract useful information.

With the introduction of computers, the power of computation increased tremendously, helping data analysis look deep into the data to find answers that can be benefited from. While cloud computing is a modern technological marvel, data analysis existed long before it. Before examining the combined power of the two, let's look at their individual strengths to understand them better.

Cloud Computing:

Cloud computing has given all business enterprises the option of not buying a cow when they need a packet of milk. The analogy may seem funny, but that was the case for all businesses before the cloud. Previously, business enterprises used to spend heavily on IT infrastructure, much of which was used just as a backup to face eventualities or peak demand. When not in use, these IT resources occupied a lot of space and wasted a lot of energy in terms of the power they consumed and, eventually, money. Cloud services have eliminated all this wastage by simply offering every IT resource as a service. Now, businesses can buy a service and pay for what they use, without needing to own the IT infrastructure.

Cloud computing services are mainly of three types. The first is Infrastructure as a Service (IaaS), where businesses can rent infrastructure like servers, networks, virtual machines, operating systems, and storage without having to buy these costly assets. This way, businesses like e-commerce platforms can make use of these services by scaling up or down depending on demand; for example, during festive seasons they can scale up resources as traffic is expected to peak, and at other times, when traffic is low, they can scale down. This lifts a ton of burden from businesses, as they need not worry about scaling their hardware and can concentrate on other important business aspects. That's one use of cloud computing: scalability of hardware on demand!
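The scale-up/scale-down decision described above can be sketched as a tiny function. This is a minimal illustration of the general idea, not any provider's actual autoscaling API; the function name, target utilization, and instance limits are all hypothetical.

```python
# Minimal sketch of the scale-up/scale-down decision an IaaS autoscaler
# might make; the function name and thresholds are hypothetical.

def desired_instances(current, cpu_utilization, target=0.6,
                      min_instances=2, max_instances=100):
    """Pick an instance count that moves average CPU toward the target."""
    if cpu_utilization <= 0:
        return min_instances
    # If utilization is above target, more instances are needed; if
    # below, fewer suffice. Clamp to the configured floor and ceiling.
    ideal = current * cpu_utilization / target
    return max(min_instances, min(max_instances, round(ideal)))

print(desired_instances(10, 0.9))  # 15 -- festive-season peak, scale up
print(desired_instances(10, 0.3))  # 5 -- quiet period, scale down
```

Real IaaS platforms expose essentially this policy as configuration (a target metric plus minimum and maximum capacity), so the business only declares intent and the provider handles the hardware.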

The second category of cloud computing service is Platform as a Service (PaaS). Here, the cloud service offers customers a complete development framework and deployment environment for any web application. Just like IaaS, PaaS includes infrastructure such as servers, networks, and storage, and in addition includes development tools, database management systems (DBMS), business intelligence (BI) tools, and all the middleware required to build and maintain a complete web application lifecycle, from inception through build, test, deployment, management, and updates. IaaS is a subset of PaaS in the sense that PaaS offers all the services of IaaS and more.

The third category of cloud computing service is Software as a Service (SaaS), where the cloud provider hosts a software application for end users to use for a price. The software provider can host the application and all the related databases using its own servers and resources, as Microsoft does with services hosted on Azure, or an independent software vendor (ISV) can use a cloud provider's help to host its application. These applications can typically be accessed through web browsers and are used by B2B and B2C users. SaaS is a superset of IaaS and PaaS.

By using any of the above-mentioned cloud computing services, a business can benefit in many ways. The first and foremost benefit of using cloud services is a drastic cut in IT costs. Scalability is another main advantage of using a cloud-based service. Also, since cloud services can be accessed from anywhere, businesses no longer need to confine their workforce to offices; employees can be scattered anywhere and work using an internet connection.

This mobility can be especially useful for startups and small businesses, where employees can work from anywhere with any device, cutting significant costs on premises rent and other amenities. One of the most important benefits of hosting your data in the cloud is security. On-site data storage is only as good as the hardware on which it resides and is vulnerable, but when you move your data to the cloud, you can rest assured that as long as your cloud provider is operational, your data is in safe hands. Cloud service providers offer data security not only against physical theft, natural disasters, and damage, but also against hackers and data burglars.

The above gives you an idea of cloud services and how a business can benefit immensely. Let's move on to our next superhero of a business: data analytics.

Data analytics:

Have you ever encountered a puzzle on social media where you need to find a hidden figure embedded inside an image within 30 seconds, and if you find it successfully, they claim you have an extraordinary IQ? You might be familiar with finding the differences between two almost identical images in magazines, or perhaps you recollect those questions from a competitive exam where you need to find the number or shape that comes next in a series. Do you know the one thing required to solve these types of questions, where at first the images, shapes, and numbers appear very ordinary and typical? Analytical observation! The ability to analyze the given data and arrive at a conclusion is called analytical thinking or observation.

Data analytics is used to analyze what appears to be meaningless, raw, and abstract data in order to discover, interpret, and communicate meaningful patterns in that data. The interpretations and patterns thus obtained can be used to optimize a business and help it perform more efficiently by providing various insights. Since the data will be huge and cannot be interpreted by humans alone, the various techniques and processes used to interpret it have been automated, with algorithms designed to be run on raw data. Implementing data analytics in a business will optimize its performance in various ways, such as reducing operational costs by identifying more efficient ways of conducting business, making better decisions, understanding customer behavior, and discovering new business opportunities and trends.

In today's digital age, every business needs data for its success, as data conveys many important things. If a business ignores its data, it may miss out on important opportunities or, even worse, may crash. Based on what type of analytics you apply to the data, analytics is divided into four major types. Descriptive analytics gives you a description of what happened in the past; it helps you look into what happened previously to plan a course of action. Diagnostic analytics diagnoses why and how something happened, whether good or bad; if it is good, a business can follow it, and if it is bad, a business can examine ways to prevent it. Predictive analytics predicts future trends based on current data; these predictions help prepare businesses for big opportunities or eventualities. Prescriptive analytics helps guide a business by offering various courses of action on how to proceed based on the given data.

All four categories of data analytics mentioned above help a business understand its current situation through descriptive analytics, how it got there through diagnostic analytics, where it is headed through predictive analytics, and, finally, how to proceed through prescriptive analytics. Depending on the problem you are facing in your business, or the future goals you are trying to set, you may choose all or some of these analytics.
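The four categories above can be made concrete with a toy example. The sales figures and the one-line "analyses" below are hypothetical and deliberately simplistic; real descriptive, diagnostic, predictive, and prescriptive work uses far richer data and models, but the division of labor is the same.

```python
# Toy illustration of the four analytics categories on monthly sales
# figures; the numbers and the "analyses" are deliberately simplistic.

sales = [100, 120, 90, 140, 160]  # hypothetical units sold per month

# Descriptive: what happened? (average monthly sales)
average = sum(sales) / len(sales)

# Diagnostic: why does month 3 stand out? (deviation from prior months)
dip = sales[2] - sum(sales[:2]) / 2

# Predictive: where are we headed? (naive linear trend to next month)
trend = (sales[-1] - sales[0]) / (len(sales) - 1)
forecast = sales[-1] + trend

# Prescriptive: what should we do? (a simple stocking rule)
action = "increase stock" if forecast > average else "hold stock"

print(average, dip, forecast, action)  # 122.0 -20.0 175.0 increase stock
```

Each step feeds the next: the description establishes the baseline, the diagnosis explains the anomaly, the prediction extends the trend, and the prescription turns the prediction into an action.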

But to apply the various analytics, you need the raw material, which is data. The more data you analyze, the more accurate the results will be.

The combination:

Now that we have understood how a business can benefit from both cloud computing and data analytics, let's look at what the combination does for a business.

There's an ocean of data, all right! But to mine those huge volumes of data and convert them into actionable information, you need powerful infrastructure in-house, more so if the data is stored on-site. Applying data analytics to such in-house stored data is a daunting task. This is where the cloud comes in. Many cloud platforms offer storage solutions for moving your enormous data onto the cloud. Not only that: apart from cloud-based storage, these platforms also offer integrated cloud analytics to apply to your data. This offers various benefits to a business, the first being not having to host infrastructure on your premises. The second big benefit is that you can easily build customized reports based on the geographical locations of your business branches, as the data for all these different locations is now available in the cloud. The third big advantage is that the analytics the cloud platform offers is much faster, more robust, and safer than in-house analytics. And lastly, the data will be safer and more secure in the cloud than on-premises!

All these advantages of combining cloud computing with data analytics will increase the efficiency of a business by arriving at various action plans and staying ahead!

As for the question of how cloud computing will drive the future of data analytics: it already is driving it. In the future, too, we can see these two evolving superheroes shouldering many big businesses to their successes!

Read more:

How Cloud Computing Will Drive The Future Of Data Analytics - HostReview.com


SA’s online threats spiked in October – IT-Online

The November edition of Trellix's Cyber Threat Intelligence Briefing for South Africa has shown a dip from around 2.6 million total files detected in August to 2.4 million in September, before shooting back up past the 2.7 million mark in October.

This top-line data measured all files, including malicious and innocuous files, with public utilities, education institutions, and financial services organisations recording the highest incidences.

On closer inspection, the volume of malicious threat campaigns saw a spike, from just over 5,000 files to over 20,000 in September, before coming back down to over 10,000 in October. By far the highest detected threat was the MyKings Botnet Clipboard Stealer.

"With South African investors entering the cryptocurrency market at a faster pace than ever, we are becoming a more attractive target for global cybercriminals, as crypto has become their biggest target," says Carlo Bolzonello, country manager at Trellix South Africa. "Alarmingly, the MyKings malware is aggressively used to install itself on machines to download crypto wallets and addresses, allowing hacking groups to clear out users' crypto wallets."

Other common threats over the period include the Zeppelin (Buran) ransomware group, an offspring of the Vega Stealer, which originated in the United States and has proliferated globally, predominantly targeting the financial services and communications sectors.

The Vice Society ransomware group is predominantly known for exploiting system vulnerabilities, especially where organisations may be slow to institute patches for prior threats. Threat actors will typically leverage access brokers, who sell the relevant tools on the dark web.

Also detected were the Crackonosh malware, which is distributed in cracked software, and the Telerik UI exploitation, which leads to malware infection by exploiting an ASP.NET component last patched in 2019.

Long-term trends

Two of the leading threat actors that have emerged in the South African landscape since the beginning of 2022 are the MuddyWaters Group and UNC1945, which target banking, finance outsourcing, and hosting services, as well as utilities. Using similar techniques and some of the same tools (like Ligolo and Impacket), these groups infiltrate environments to steal credentials. MuddyWaters might also go further, leaving ransomware in environments while selling credentials on to third parties once acquired.

"South Africa has seen a growing emergence of threat actors using tools like CrackMapExec and BadPotato, which are quite openly available and conduct surreptitious vulnerability assessments of systems to access privileges," Bolzonello says. "Other threat actors, like APT28, will go after cloud accounts and infrastructure, moving laterally through systems with minimal detection."

"Staying abreast of some of these evolving threats will require a comprehensive strategy for cloud-hosted and on-premises threat detection, using live data from security operations centres (SOCs)," he adds.


More here:

SA's online threats spiked in October - IT-Online


Developing Applications in Space with Azure Orbital Space SDK – InfoQ.com

Microsoft recently announced the preview release of the Azure Orbital Space SDK to provide developers with a secure hosting platform and application toolkit designed to enable them to create, deploy, and operate applications on-orbit.

With the SDK, developers have access to templates, samples, and documentation to make it easy to get up and running with template applications for common workload patterns, such as earth observation image processing. In addition, a "virtual test harness" allows developers to quickly test their applications on the ground against an instance of the host platform.

Source: https://azure.microsoft.com/en-us/blog/any-developer-can-be-a-space-developer-with-the-new-azure-orbital-space-sdk/

Furthermore, the blog post explains that developers can leverage the SDK to write and host more intelligent applications on board satellites, capturing data, using time more efficiently, and even autonomously reconfiguring applications. The SDK provides a standard template for completing imaging activities, making it simpler to transfer models and applications from one satellite configuration to another, which saves developers from writing a new solution each time they launch a spacecraft application.

In addition, the SDK also enables more sophisticated management of satellite communications by providing a compute fabric with networking capabilities for hosting telecommunication workloads. Operators can migrate applications more quickly from on-ground cell sites to satellites in space, providing higher resiliency and network utilization.

Steven Kitay, a senior director at Microsoft Azure Space, told InfoQ:

At Microsoft, we are on a mission to combine the power of the cloud with the possibilities of space, and the SDK is the latest way to turn those possibilities into realities. The Azure Orbital Space SDK makes it easier for space developers to create secure applications that run on spacecraft. They can build applications to task, acquire, and process imagery and downlink it from satellites to ground stations. This approach has many benefits for developers and satellite operators alike, including saving time since the data is captured at the edge, allowing for reconfigurability, and providing satellite interoperability and scale.

Earlier in April, Microsoft launched the Azure Space Partner Community and disclosed its first space community partners, including Loft Orbital, Ball Aerospace, and Thales Alenia Space. With the SDK preview release, the company adds Xplore, who will help them continue to shape the future of space technologies and services. This includes leveraging the SDK to gather new insights into how edge computing solutions can better enable government and commercial customers to achieve their mission objectives. Furthermore, Microsoft's existing community partners will leverage the SDK.

The Azure Orbital Space SDK is available via private preview for companies through the Azure Space Partner Community and universities through the Azure Space Academic Outreach Program.

Read more here:

Developing Applications in Space with Azure Orbital Space SDK - InfoQ.com

Categories
Cloud Hosting

Keep Your Cloud Secure: A Fitness Routine for Your Cloud Environment | – Spiceworks News and Insights

Companies are increasing their cloud adoption. Simultaneously, cloud environments face both the security challenges of on-premises environments and new ones that arise from their core benefits. Hence, companies should become smarter about their defenses. Here, Mark Kedgley, CTO of Netwrix, shares best practices for cybersecurity defenses.

I find that the second week of a diet is easier than the first: I have always given up by then! While we all know that the only way to achieve lasting fitness is to eat smarter and be active, it is difficult to stop looking for a magic pill. We want to believe that just a kale smoothie will deliver the results we want.

Similarly, there are no shortcuts to attaining strong cybersecurity, and many organizations are falling short of their goals. Netwrix recently surveyed over 700 IT security professionals, and there were a couple of findings that should grab everyone's attention:

Source: Netwrix 2022 Cloud Security Report

Even as the threat to cloud IT systems grows, organizations are increasing their cloud adoption. About 54% of workloads are planned to be in the cloud by the end of 2023, compared to 41% today. Accordingly, it is vital to get a lot smarter about cybersecurity defenses.

As with fitness, strong cybersecurity requires disciplined, consistent practice. It is not quite "no pain, no gain," but it is much more than just buying a SIEM system and configuring some firewall rules. Indeed, cloud environments face both the security challenges of on-premises environments and new ones that arise from their core benefits.


Let us assume that a strategic business case has already been made to migrate to the cloud. Today that often happens when the realization dawns that a new data center will be needed or a hardware refresh is coming around. The eye-watering costs and the anticipated logistical challenges almost inevitably lead to the conclusion that cloud computing would make life much better.

A key question that decision-makers should consider: are we re-hosting, re-platforming, or re-architecting? The answer is largely driven by whether the assets in question are in-house developed applications, and by the current state and future direction of IT services. For most organizations, it is a combination of all three paths because every application has different requirements for now and moving forward. If you are stuck with any legacy applications running on old platforms, then it is likely that a hybrid cloud is coming your way. Then you will have the opportunity to reap the benefits of DevOps with a CI/CD pipeline and instantly refreshed, elastic, container-based microservices applications down the line!

From a security standpoint, the cloud is highly attractive if it removes your data center security and business continuity responsibilities. However, even though you will no longer have a physical data center to secure, you will need to implement new access security controls and get a clear understanding of the activities and rights of your in-house resources and those of the service provider.

Start with the basics. One fundamental security best practice is the principle of least privilege. But that principle is as likely to be flouted in the cloud as on-premises. It is simply easier to over-provision accounts than to tailor rights as tightly as possible, much as it is easier to overindulge in treats or skip todays workout than to stick to your fitness plan. For help, look to cloud infrastructure entitlement management (CIEM) tools that facilitate processes like regular entitlement reviews to accurately enforce the least privilege, as well as monitor user activity and maintain clear and complete audit trails. Also, consider adopting a zero standing privilege (ZSP) approach in which privileged access is granted only temporarily, on demand, when required.
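The ZSP idea can be illustrated with a short sketch (hypothetical class and field names, not any particular CIEM product's API): privileged access is granted with an explicit expiry, so there is no standing privilege for an attacker to steal.

```python
import time

class TemporaryGrant:
    """Sketch of a zero-standing-privilege (ZSP) grant: the privilege
    exists only for a bounded window, then expires automatically."""

    def __init__(self, user: str, role: str, ttl_seconds: float):
        self.user = user
        self.role = role
        self.expires_at = time.time() + ttl_seconds

    def is_active(self) -> bool:
        # A real CIEM/PAM tool would also enforce an approval workflow
        # and write every check to the audit trail.
        return time.time() < self.expires_at

# Grant "alice" the db-admin role for 15 minutes, on demand.
grant = TemporaryGrant("alice", "db-admin", ttl_seconds=900)
assert grant.is_active()
```

The key design point is that access is the exception, not the default: once `expires_at` passes, the entitlement review has nothing to revoke because the grant is already gone.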

Multifactor authentication (MFA) offers another layer of identity security, helping to prevent the hijacking of credentials. In many cloud environments, MFA is offered as a configurable option but is not a default setting. Organizations need to weigh the benefits of increased security against the risk of user frustration and productivity losses.
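Many MFA authenticator apps are built on time-based one-time passwords (TOTP, RFC 6238). A minimal sketch of the algorithm, using only the Python standard library (the secret below is the RFC's published test key, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at_time=None, step=30, digits=6):
    """Minimal RFC 6238 TOTP: HMAC-SHA1 over the current 30-second
    time step, dynamically truncated to a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at_time is None else at_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 reference secret: "12345678901234567890" in base32.
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
assert totp(SECRET, at_time=59) == "287082"  # RFC test vector, 6 digits
```

Because the code is derived from a shared secret plus the current time step, a phished password alone is not enough; the attacker also needs the device holding the secret during the same 30-second window.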

Pre-built images provide a good starting point for hardening an environment. It is vital to remember that hardening is not a one-time operation; you also need automated, continuous monitoring for drift backed by effective reporting and alerting. It is rather like an exercise log that helps you keep your fitness program on track.
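Continuous drift monitoring boils down to comparing the live configuration against a hardened baseline. A toy sketch (the settings and values are illustrative, not from any real benchmark; a production file-integrity or CSPM tool does far more):

```python
import hashlib
import json

def config_fingerprint(config: dict) -> str:
    """Hash a canonical (key-sorted) view of a configuration so two
    snapshots can be compared cheaply and order-independently."""
    canonical = json.dumps(config, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def detect_drift(baseline: dict, current: dict) -> list:
    """Return the settings whose values changed since the baseline,
    including settings added or removed entirely."""
    return sorted(k for k in baseline.keys() | current.keys()
                  if baseline.get(k) != current.get(k))

baseline = {"ssh_port": 22, "mfa_required": True, "public_buckets": 0}
current = {"ssh_port": 22, "mfa_required": False, "public_buckets": 2}
assert detect_drift(baseline, current) == ["mfa_required", "public_buckets"]
```

Run on a schedule, the fingerprint comparison is the automated "exercise log": an unchanged hash means no drift, and a changed hash names exactly which settings to investigate via `detect_drift`.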

However, effective change control can be a steep challenge. You need a consistent picture across all cloud systems in use, including hybrid and private clouds, as well as the traditional data center and legacy IT platforms and applications. And on top of gaining complete visibility into all changes, you need to understand whether each change was planned or unplanned, good or bad, expected or potentially malicious. Again, there are tools and technologies that can help you achieve and maintain a hardened cloud or hybrid infrastructure.

Cloud technologies and platforms are comparatively new, so none of us have as much experience with their challenges as we do with systems like Linux and Windows. So set the alarm clock and get to the gym early; as soon as you finish your workout, there is another busy day of cloud security to get on with!


Visit link:

Keep Your Cloud Secure: A Fitness Routine for Your Cloud Environment | - Spiceworks News and Insights

Categories
Cloud Hosting

US Department of the Interior seeks $1b single-vendor cloud contract – The Register

The United States Department of the Interior has posted a final solicitation for a $1 billion cloud computing services contract that runs for 11 years and will be awarded to a single vendor.

According to the final request for proposals [PDF], the department wants a "single Indefinite Delivery, Indefinite Quantity contractor" that will deliver cloud hosting across the DoI.

The department, among other things, looks after National Parks. It is supposed to protect the US's natural resources and manage them as commercial entities (it employs park rangers at the Grand Canyon, for example).

Under Cloud Hosting Service III, it wants a single Virtual Private Center for cloud services to support its cloud and managed service requirements.

Under previous Secretary of the Interior Ryan Zinke, the department controversially opened some of those federal lands to oil, gas and coal prospectors.

Just two months ago, a federal judge put a stop to coal leases on public lands, with the Obama-era moratorium reinstated under current Secretary of the Interior Deb Haaland, a member of the Native American Pueblo people who has advocated for investments from the Bipartisan Infrastructure Law to help clean up legacy pollution.

Now the department is looking for a single vendor for a "green IT" solution, saying the main drivers of the effort are a push towards cloud migration and datacenter consolidation. The notice says it hopes to "reduce the IT footprint for agencies," consolidate "traditional DCs" (implying it plans to cut a few), and save on hardware. It also claims the contractor will help the department improve its "overall IT security posture" and shift IT investments to more "efficient" computing platforms and technologies.

The winning bidder will need to meet Federal Risk and Authorization Management Program (FedRAMP) security requirements, so it will likely be one of the larger cloud service providers (CSPs). What the winner can actually deliver in carbon offsets will vary, though theoretically there would be environmental economies of scale.

The security aspect will be worthwhile: according to a 2020 pen-testing report from the department's inspector general, among other failings, the DoI's internal wireless network could be broken into over the air using a smartphone and under $200 worth of electronics stuffed into a backpack.

Previously, the DoI contracted its cloud under a multi-vendor contract, with previous tussles over services to the department including a knockdown fight between Microsoft and Google when the department found the latter's apps were not sufficiently "secure."

In 2013, the DoI spent $10 billion on its Foundation Cloud Hosting Services, contracted out to 10 vendors including Verizon.

The current $1 billion project is dwarfed by the Department of Defense's Joint Warfighting Cloud Capability mega-deal, the JEDI replacement worth up to $9 billion that went to AWS among other vendors; whoever scores this one, at least, will not have to divide the spoils.

See original here:

US Department of the Interior seeks $1b single-vendor cloud contract - The Register

Categories
Cloud Hosting

Singapore’s government cloud saves country 50% in hosting costs – IT PRO

The Singapore government has managed to save around 50% of its hosting costs after moving to the cloud, according to a recent cost-benefit study.

Around 60% of the government's eligible systems have been migrated to the government commercial cloud (GCC), said Janil Puthucheary, senior minister of state at the ministry of communications and information and minister-in-charge of GovTech, the nation's digital transformation agency, after a continued effort to shift more reliance onto cloud infrastructure over time.

New systems are now being developed directly on GCC by default, resulting in significantly shorter lead times and ensuring the government's engineers and partners become more experienced in cloud development and deployment, said the minister.

Moving software development onto the cloud also allows developers to focus less on infrastructure and compliance, and more on developing application logic, said Puthucheary. This saves countless hours and reduces human errors.

The minister underlined that the government hasn't been able to grow its cloud capabilities or create a strong tech culture by itself. He said that partnerships are vital and the GCC wouldn't have been possible without these links to cloud service providers.

One of the partnerships the government has established is with the Singapore Financial Data Exchange (SGFinDex), which was developed in collaboration with seven participating banks. It gives individuals a consolidated view of their financial information held across different government agencies and financial institutions.


It's the world's first public digital infrastructure to use a centrally managed online consent system, backed by a national digital identity, said Puthucheary. SGFinDex has expanded beyond government agencies and banks to include insurance companies, enhancing the financial planning process for our users.

The cloud has been a focus for the government recently after it chose Microsoft in February 2022 to develop a sovereign cloud to accelerate the digital transformation of the country's Home Team Science and Technology agency (HTX).

It's set to be built on the tech giant's Azure platform and will provide HTX with high-performance cloud computing and data storage capabilities. At the time, the agency was set to use the infrastructure to quickly adopt and produce new technologies.


See the original post:

Singapore's government cloud saves country 50% in hosting costs - IT PRO