Artificial intelligence (AI), like so many workloads today, is subject to sudden capacity spikes. Data often arrives in waves, and if it must be processed in real time, your AI applications have to scale with it. If these spikes are significant, building a scalable application on premises can be expensive, maybe even unaffordable.
Cloud computing, with its pay-as-you-go model, has democratized scalability and made AI far more feasible. As we’ll explore in this article, running AI in the cloud has many benefits over running it on premises, including significant cost savings. We’ll start with a little history, then look at the difference between running AI on premises and in the cloud, and what it takes to implement an AI-in-the-cloud approach.
In the early days, AI research experienced ups and downs as scientists around the globe cycled between excitement and disappointment in their attempts to build a smart computer. Each time they envisioned a new approach to AI, their theories were quickly shut down by a lack of high-performance processing power. The development of deep neural networks is a good example.
It all started with a paper in 1943, which proposed that the brain could be simulated by a computer if neurons were represented as connected digital processing gates. The simple idea behind artificial neurons was that they could be trained to either fire or not fire when given a set of inputs. If multiple artificial neurons were connected, a decision could be made by evaluating which neurons fired and which did not. This is the same principle by which our brains operate.
It wasn’t until 1954 that the first successful artificial neural network was created, which contained only 128 neurons due to memory limitations. The network was trained to recognize a pattern of 1s and 0s and predict the next number in the sequence. In addition, the researchers discovered that removing 10% of the neurons in the network did not hinder its performance, which is somewhat similar to how our brain operates when it gets slightly damaged.
Despite this success, neural network research and funding were mostly halted. The most prominent reason for the budget cuts was that computers were not powerful enough: it could take weeks to train a simple pattern recognition network. Today, training the same 128-neuron network will not even make a noticeable spike in your computer’s resources, and it will be done in about two minutes rather than the weeks it took back then.
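To make that concrete, here is a hypothetical toy sketch (not the original 1954 experiment): a single artificial neuron trained with the classic perceptron rule to recognize a simple bit pattern, the kind of task a modern laptop finishes in well under a second.

```python
import numpy as np

# Hypothetical toy example: one artificial neuron trained with the classic
# perceptron rule to "fire" when the majority of its 8 binary inputs are 1.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 8))          # random 8-bit input patterns
y = (X.sum(axis=1) > 4).astype(int)            # label: did the majority fire?

weights = np.zeros(8)
bias = 0.0
for _ in range(20):                            # a few passes over the data
    for xi, target in zip(X, y):
        fired = int(weights @ xi + bias > 0)
        update = target - fired                # nudge weights on mistakes only
        weights += update * xi
        bias += update

predictions = (X @ weights + bias > 0).astype(int)
accuracy = float((predictions == y).mean())
print(f"training accuracy: {accuracy:.2f}")    # effectively instant on a laptop
```

The perceptron rule only adjusts the weights when the neuron misclassifies an example, which is why such a separable pattern is learned in a handful of passes.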
Another reason for the failure of these early AI systems is that there wasn’t enough data to train them (a process now known as machine learning, or ML). Organizations did not understand the value of data back then, so they did not collect any. Even if they had known data was valuable, there was almost no way to store it in such large volumes, since data storage was not cost-efficient like it is today.
However, in the 1980s, continued research and funding resulted in more sophisticated algorithms and the availability of more powerful computers. Only at this time did people start believing that AI was a possibility.
Perhaps the most notable early example of AI was the Deep Blue computer, a system designed to play chess. Deep Blue is believed to be the first computer to win a chess match against a reigning world champion, largely thanks to its 11.38 gigaflops of processing power. Its predecessor, Deep Thought, wasn’t bad at chess either, but it didn’t come close to its younger sibling Deep Blue.
The chess-playing computers of the time worked on the simple principle of brute force. By evaluating every possible combination of moves, and what those moves would lead to, the computer could determine the best strategy. In simpler terms, the further ahead you can evaluate, the better the move you can pick. Some argued that it wasn’t true AI, but the grandmasters who played against the machines told stories of sensing a kind of intelligence behind the artificial opponent’s moves.
Even though Deep Thought couldn’t compete against a world champion, in 1988 it did win a regular tournament game against a grandmaster. At the time, it could evaluate about 10 moves ahead. For comparison, in 1996 Deep Blue could evaluate a massive 20 moves ahead, largely due to its enhanced performance and highly parallel architecture. It was the ability to connect many machines together efficiently that helped Deep Blue win that match.
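The brute-force principle described above can be sketched on a toy game. The example below is hypothetical (a Nim-style game, not chess): two players alternately take 1 to 3 sticks, and whoever takes the last stick wins. The program simply evaluates every possible line of play to the end, just as Deep Blue evaluated chess positions, only on an astronomically larger tree.

```python
# Brute-force game-tree search (minimax) on a toy Nim-style game:
# players alternately take 1-3 sticks; whoever takes the last stick wins.

def minimax(sticks, maximizing):
    """Exhaustively score a position by evaluating every line of play."""
    if sticks == 0:
        # The previous player took the last stick and won.
        return -1 if maximizing else 1
    scores = [minimax(sticks - take, not maximizing)
              for take in range(1, min(3, sticks) + 1)]
    return max(scores) if maximizing else min(scores)

def best_move(sticks):
    """Pick the move whose subtree scores best for the player to move."""
    return max(range(1, min(3, sticks) + 1),
               key=lambda take: minimax(sticks - take, False))

# From 10 sticks, taking 2 leaves the opponent on 8, a losing position.
print(best_move(10))  # → 2
```

Chess engines of the Deep Blue era used the same exhaustive evaluation, cut off at a fixed depth and scored with a heuristic, which is why more parallel hardware translated directly into looking further ahead.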
Since those days, computer speeds have increased exponentially, and machine learning and AI are once again the center of attention. Today, we can run sophisticated algorithms using only the data processing power of our laptops. Even the once great Deep Blue system cannot compare to what we carry in our pockets every day.
A supercomputer that cost over $100 million and filled a small room in 1993 delivered about 600 gigaflops from 2,048 processors. The Apple iPhone with its A13 Bionic chip delivers 600+ gigaflops on 18 cores, costs about $1,000, and fits in your pocket.
As we invent devices that are designed for quicker calculation, such as AI Processing Units (APUs) and Tensor Processing Units (TPUs), we are pushing the boundaries of processing power.
Until recently, any AI component assumed learning on top of data that was externally generated and captured. The most sophisticated chess algorithms used games played by humans as their starting point for learning. As a result, the knowledge of the neural network behind an AI chess player was limited to what people could capture. Despite this, a modern AI player can outplay any human chess player, because it can evaluate more moves ahead than humans can. However, the strategy the AI player chooses is still human-like, due to the nature of the data used for training.
In 2017 this changed. DeepMind, the AI subsidiary of Google, created a computer program built on a neural network architecture called AlphaZero. Instead of searching for the best moves using algorithms and data created by humans, it was designed to learn how to play games by itself. Knowing only the rules of chess, it took just 24 hours of self-play to become the new world-champion-level chess player. During this time, the program played 44 million matches against itself to train the neural network to find and memorize the best playing strategies. This massive amount of training in such a short time was possible largely thanks to the 5,000 TPUs the machine could use in parallel.
After these 24 hours of training, AlphaZero achieved a superhuman level of play and defeated the world-champion programs Stockfish and Elmo, as well as its predecessor AlphaGo Zero. In 100 games against Stockfish, AlphaZero won 25 games as White and three as Black, and drew the remaining 72. During the matches, AlphaZero ran its previously trained neural network on a single machine with four application-specific TPUs and 44 CPU cores. AlphaZero is the first step toward a future where AI can build knowledge and strategies that supersede human capabilities.
Our advances in hardware sound like a great technological leap forward, but the biggest leap of all is AI in the cloud. The most popular cloud providers (AWS, Azure, and GCP) put teraflops of computing power at our fingertips. As with AlphaZero, that incredible amount of compute can be used to train your AI model at large scale up front, then serve it with limited, economically viable resources afterward.
Creating AI has always been a monumentally difficult task. Having the ability to train large scale models without investing thousands of dollars into infrastructure makes the process a lot easier, not to mention it is a dream come true for data scientists.
Next, we will explore what makes cloud-based AI so great and why it beats investing in on-premises infrastructure. We will cover typical machine learning and AI development tools that support dataset creation, model training, and inference.
The move to a cloud platform seems scary at first because it is unfamiliar compared to on-premises, which has been around for ages. On-premises presents the illusion of greater control, easier maintainability, easier scalability, and enhanced security, and this illusion stops people from making the switch to cloud services. By examining what the cloud has to offer and comparing the on-premises approach to the cloud approach, we hope to break the illusion and show that using the cloud for AI and machine learning is the right approach.
First, let’s examine the benefits of using cloud versus on-premises or data center solutions. You could start out by creating your models on a laptop, as most of us do. This is a good way to create a proof of concept model, where there is not much performance required and the benefits of cloud are not as apparent. Sooner or later you will start noticing bottlenecks in your setup, since fitting a model will render your laptop useless for any other task.
At that point, a decision needs to be made: you could go out and buy a dedicated machine learning computer, or rent one from a leading cloud provider with a state-of-the-art machine learning setup. If you decide to spend some cash on a dedicated system, you will eventually run into the same issue down the road when there is not enough power. With a cloud solution, on the other hand, all you have to do is rent a slightly more expensive compute instance, or even run multiple cheap instances in parallel. Scalability and parallel processing are the main reasons why AI in the cloud is great.
The next challenge with on-prem is setting things up. When using your own hardware, you must configure every piece of software and every library yourself. Even a simple task such as installing proprietary GPU drivers on your machine is tedious work. With a cloud solution, you can wave goodbye to that wasted time, since all the software and libraries come pre-configured and ready to go.
Cloud providers give you assurances that if you select one of the supported libraries, it will work out of the box and will scale without causing any issues. It will also be tightly integrated with the other solutions the cloud provider offers. Another hidden benefit of using pre-installed software on your cloud machine is that, in some cases, the software is an optimized version of the open source release the public has access to. Simply by using the cloud versions of these libraries, you may see a noticeable performance boost.
Redundancy and security are also cloud computing benefits: you know that the files you put on the server will stay there until you delete them, and they won’t be accessed by unwelcome people. By storing multiple copies of your data, cloud providers ensure it is backed up in the event of a catastrophe.
In addition, cloud servers can be configured to make sure that only the proper personnel can access your data or your instances. All cloud providers support Role-Based Access Control (RBAC), which lets you create user roles to dictate who has access to your systems. Cloud providers also have built-in solutions to handle major data-protection and security compliance regimes like GDPR, HIPAA, FedRAMP, and others. The same cannot be promised of on-premises solutions.
Every cloud provider has an extensive suite of services in their portfolio designed to decrease the workload and speed up delivery time for AI researchers. In addition, everything in the cloud is designed to work together and make life easier. The services can be broken down into data preparation and modeling.
To make any kind of artificial intelligence, you need big data (meaning a lot of data) that the machine can use for learning. Because cloud resources scale easily, you can process terabytes of data in a matter of hours. Plenty of tools are configured and ready to use as soon as you boot up a computer in the cloud.
Let’s say you would like to create a dataset by downloading some information from the internet and processing it. The best way to go about it would be to use Apache Spark, because of its ability to parallelize your transformation algorithms. You could use Amazon EMR, GCP Dataproc, or Azure Databricks to quickly create a cluster that is set up for autoscaling. Even though these solutions come from different cloud providers, they are always set up and ready to go with the newest releases of Apache Spark.
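To illustrate the idea on a single machine, the sketch below uses Python's standard library as a stand-in for what Spark does at cluster scale: apply one transformation function to many records in parallel, then collect the results. This is a hypothetical local example, not Spark itself; on EMR, Dataproc, or Databricks the same map-style logic would be distributed across autoscaling worker nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    """A hypothetical per-record cleanup step, e.g. normalizing scraped text."""
    return record.strip().lower()

def process(records, workers=4):
    # Map the transformation over all records in parallel and collect results;
    # Spark applies the same pattern across a whole cluster instead of threads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transform, records))

raw = ["  Alice ", "BOB", " Carol\n"]   # pretend this came from the web
print(process(raw))                     # → ['alice', 'bob', 'carol']
```

Because each record is transformed independently, the work splits cleanly across workers, which is exactly the property that lets Spark scale the same job from a laptop to a cluster.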
After your mountain of data has been processed, you can store it using one of the integrated database services offered by cloud providers. Amazon Aurora, for example, can be used to create a scalable relational database. If relational databases do not float your boat, try GCP Cloud Bigtable, a low-latency, scalable non-relational database. And if you are after something fresh and new, you might want to experiment with Delta Lake on Azure, which can help you start your organization’s next big data project.
Regardless of your data use case, the cloud is ready and integrated with almost anything you might want. No more configurations or spending money on server racks, just pay for what you use and sleep well knowing that your data is protected and has multiple layers of redundancies.
Now that we have covered the data part, we can move on to modeling. Any AI project will require you to create a machine learning model using one of the proven algorithms or a custom solution. By setting up your project in the cloud, you gain an advantage over your competition. Most cloud providers offer multiple state-of-the-art machine learning models that are already trained and ready to use. If using someone else’s models is not your style, you can use the model creation tools the cloud provides to speed up your delivery time.
AWS offers SageMaker, a fully managed solution that allows anyone to create, test, and deploy models with ease. SageMaker removes the burden of creating a robust machine learning architecture by providing the popular frameworks, fully integrated, in one bundle. This lets developers and data scientists skip setting up the environment and get right into creating models.
SageMaker Studio is an integrated development environment (IDE) that gives you access to built-in tools so you can automatically train any type of model, monitor and debug the training process, collaborate faster using SageMaker Notebooks, and deploy the model to production. Recreating all these processes yourself would take enormous effort from developers and data scientists, and it would also cost much more than simply running them in the cloud. Keep in mind that this approach does require you to know how to write code; with the next solution, however, no coding is required at all.
If you are new to machine learning and are not too familiar with the popular machine learning frameworks, then Azure Machine Learning Studio might be for you. With its simple drag and drop user interface, you don’t need to write any code at all. Simply select prepared modules that complete a certain task, put them in the proper order, and let Azure handle the rest. Note that writing custom code is allowed, but it is totally up to the user to decide if they need a more custom approach.
Azure ML Studio lets you drag and drop modules for importing datasets, performing transformations on your data, training and evaluating a model, and so on. Azure will transform your pipeline into a working model and give you the ability to publish it to a production inference server. The entire experience is so easy that you can create a working model in a matter of hours without any coding experience.
To help data scientists, Google takes an approach similar to Amazon SageMaker’s. Instead of a code-free option, Google offers a scalable environment that uses its TPU servers for an additional performance boost. Note, however, that Google’s AI platform was not designed to prepare and transform your datasets. In the documentation, Google explains that the service covers model training, evaluation, deployment, and monitoring. Therefore, if you are planning to use it, you should come with a prepared dataset.
More and more businesses are migrating their applications to the public cloud or hybrid cloud. The level of affordability that autoscaling can provide and the level of redundancy cannot be matched by an on-premises solution.
Deep Blue once shook the world by presenting what highly parallel computing power can do to further our quest for artificial intelligence. The cloud now puts this same power at our fingertips at a much higher order of magnitude than was available to us before. It would be unwise not to use it for AI integration and research.
In the cloud, everything is preconfigured, managed, and easy to use. It is a perfect place to try out new frameworks, train highly scalable AI models, or simply run cheap experiments of any kind. Whatever your use case may be, make sure to try AI in the cloud before buying more server racks.
Container use is exploding right now. Developers love them and enterprises are embracing them at an unprecedented rate.Read more
If you’re making the move to containers, you’ll need a container management platform. And, if you’re reading this article, chances are you’re considering the benefits of Kubernetes.Read more
Wouldn’t it be nice to reach artificial intelligence (AI) nirvana? To have a system that provides real-time, context-aware decisions.Read more
Today’s IT environment is moving and evolving at an unprecedented pace. So, all of a sudden, your 5-year old software infrastructure can look more like it’s 50. To get your software current – and stay there – requires flexibility. Moving to containers does just that. There’s been lots of talk about containers over the past few years – so why aren’t you on the bandwagon yet?Read more
Under pressure to deliver applications faster and ensure 24/7 runtime, organizations are increasingly turning to DevOps methodologies to deliver applications quicker and in an automated fashion. But what tools should you have in your DevOps toolkit?Read more
Amazon Web Services (AWS), Azure, and Google Cloud Platform (GCP) are the public cloud market leaders, but how do you determine which of them best supports your enterprise's specific needs? For most enterprises, and for the foreseeable future, it’s going to be a multiple answer question.Read more
As the dominant movie rental service in the 90s and early 2000s, Blockbuster was the market leader, seemingly indefatigable. Until the great disruptor, Netflix, hit the scene.Read more
Big Data. Everyone’s paying for it, collecting it, and talking about it, but what are companies actually doing with it?Read more
The API management market is a hot one. As more organizations make investments in mobile, IoT, and big data, APIs are a core of their digital strategy.Read more
Big data is everywhere. Organizations are being advised to hoard it and do everything they can to derive actionable insights. This article will argue that this approach puts the cart before the horse.Read more
Let’s face it. Organizations struggle with their legacy applications. Even when they still solve some of the business’ problems, they reach a point where they can no longer keep up with market and industry demands.Read more
Let’s flash back to 2000. You’ve survived Y2K and you’re building systems for CRM, inventory, logistics, or data. They’re all state-of-the-art, and get the job done, even if they don’t talk to each other.Read more
It’s a mobile app world, and we just live in it. But for those working on the “next big thing,” there’s a conundrum – everyone knows we should be building apps in HTML, but not every device out there runs it as smoothly as it should.Read more
In technology, everyone likes to talk about “future-proofing.” But even for the most cutting-edge tech, time always catches up.Read more
The future is here. No, we don’t have flying cars or robot butlers – yet – but it’s definitely a digital world.Read more
We’re excited to announce Microsoft Azure support for the Kubernetes auto scaling module, an open source system for automating deployment, scaling, and management of containerized applications.Read more
You can’t mention enterprise technologies today without getting into a discussion about the cloud. “Are you in the cloud yet?” Why jumping headlong into cloud computing may not be the necessary move for your business.Read more
In the mad rush to move to the cloud, some organizations put the proverbial cart in front of the horse. They’re just looking for the best hosting, the preferred provider, or whatever the rest of the industry is using.Read more
2016 saw momentum in many areas – DevOps, cloud technologies, and big data- at the thrust of innovation. So, what tech predictions will define 2017?Read more
Every month, week, or day, it seems there’s buzz about yet another solution or service that will revolutionize your industry – or more simply, make your life easier.Read more
Apps. Sensors. They’re everywhere. Your phone, your car, your TV, even your refrigeratorRead more
In an increasingly commoditized market, learn how to cut through the noise and forge a cloud strategy that meets your needsRead more
Fleet management is a challenging business. This is particularly true of snow removal services where the dynamics on the ground can change fast and the pressures to perform put fleet supervisors to the test – in the toughest of conditions.Read more
Long before the first flakes fall from the sky many municipalities begin to prepare for the cold, icy, and snowy conditions that inevitably lie ahead.Read more
Fun fact: in 2014, cloud services were already a $45 billion business worldwide, and are expected to grow to $95 billion by 2017. Will you be part of that equation?Read more
Simple is good. Simple is clean. And whether I’m cooking or planning a trip, simple is always better, right? So why do so many companies make user experience (UX) so complex?Read more
Future-ready predictive analysis infrastructures hold the key to gaining insights from data today, and into tomorrow.Read more
Immersive and exciting, Virtual Reality is already part of our lives, whether it’s a plot device in a new sci-fi thriller or the best way to enjoy the latest video games or thrill rides.Read more
Now that smartphones are the most widely used tool for navigating important life activities (nearly two thirds of Americans own one), there’s pretty much an app for everything these days.Read more
If you’re tasked with choosing an API management system, Charles Dickens summed it up best: “It was the best of times, it was the worst of times.”Read more
DevOps: the panacea for all that’s wrong with enterprise IT. Where siloed teams who keep information close to their chest are replaced by agile, transparent relationships between developers and operations and fast and stable workflows that improve IT efficiency significantly and very visibly.Read more
As a technology company focused on complex project integrations that unify legacy systems as well as modular solutions that ensure lasting scalability, we work on a multitude of projects that involve custom software development; packaged, open source, and SaaS software integration; infrastructure setup; and production operations and maintenance.Read more
In an earlier blog we talked about why you need to integrate API management into your business strategyRead more
In a previous release of “What the Tech?” we discussed why you should integrate API management into your business strategy.Read more
Smart cars, smart homes, smart devices. The Internet of Things (IoT) is already transforming how we live. But very soon, the IoT will swiftly extend into the enterprise.Read more
Why you Need to Integrate API Management into your Business StrategyRead more
The promise of big data is, well, big! With terabytes of intelligence at their disposal, organizations can make faster, more accurate decisions, monitor trends, and even predict the future.Read more
Businesses accumulate data, create content, or possess unique business logic—each of which represents an untapped business opportunity. But how can organizations realize that opportunity?Read more
The Internet of Things (IoT) is much more than a consumer trend, it’s rapidly changing the way enterprises are using data to improve business decision-making.Read more
Content consumption is changing rapidly. With multiple channels and media formats, reaching target audiences is getting harder than ever.Read more
The way in which we consume content is changing rapidly and a few trends have emerged recently that we think will have a meaningful impact on media organizations this year and in years to come.Read more
Building a mobile app isn’t as simple as it used to be. With multiple devices to cater to, development teams must ask themselves a few questions:Read more