New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content.
Data science toil saps agility and prevents organizations from scaling data science efforts efficiently and sustainably. Here’s how to avoid it.
Legacy networking approaches don’t align with the way that cloud providers create services or access and only introduce more complexity. Move to the cloud, but leave your traditional networking behind.
Like Kubernetes itself, the underlying object storage should be distributed, decoupled, declarative, and immutable.
An overview of the strengths and weaknesses of today’s cloud database management systems.
For any company exploring the potential of the cloud and Kubernetes, adopting infrastructure as code, security as code, and automation will be essential.
Empowering cloud teams with automated policy-as-code guardrails helps them move faster and more securely.
Much of the software we use today is built on re-implemented APIs, like the Java API in question in Oracle v. Google. An Oracle victory would have stopped open-source innovation in its tracks.
Today, companies across every industry are deploying millions of machine learning models across multiple lines of business. Soon every enterprise will take part.
SLAs are for lawyers. Service level objectives not only introduce finer-grained reliability metrics, but also put that telemetry data into the context of user happiness.
Shipping software has always been about balancing speed and quality control. Many great technology companies built their empires by mastering this skill.
Algorithmic biases that lead to unfair or arbitrary outcomes take many forms. But we also have many strategies and techniques to combat them.
Data science can make robotic process automation more intelligent. Robotic process automation makes it easier to deploy data science models in production.
We wouldn’t roll our own cloud orchestration or payment processing software. Why are we still building our own authorization infrastructure?
AI has the power to liberate organizations from CRM-related manual processes and improve customer engagement, sales insights, and social networking, for starters.
Grafana Tempo is an open source, easy-to-use, high-volume distributed tracing system that takes advantage of 100% sampling, and only requires an object storage back end.
Time series forecasts are used to predict a future value or a classification at a particular point in time. Here’s a brief overview of their common uses and how they are developed.
The next frontier for data processing is a new platform capable of delivering insights, actions, and value the instant data is born.
Time series analysis involves identifying attributes of your time series data, such as trend and seasonality, by measuring statistical properties.
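As a minimal sketch of the idea (synthetic data and helper names are mine, not from the article), a trend can be estimated with a centered moving average and a crude seasonal profile with per-position means:

```python
from statistics import mean

def moving_average(series, window):
    """Centered moving average: a simple estimate of the trend component."""
    half = window // 2
    return [
        mean(series[i - half:i + half + 1])
        for i in range(half, len(series) - half)
    ]

def seasonal_profile(series, period):
    """Average value at each position within the period (crude seasonality)."""
    return [mean(series[i::period]) for i in range(period)]

# Synthetic data: an upward trend plus a repeating 4-point seasonal bump.
data = [t + [0, 3, 0, -3][t % 4] for t in range(24)]

trend = moving_average(data, 5)   # roughly follows the underlying slope
season = seasonal_profile(data, 4)  # season == [10, 14, 12, 10]
```

Real analyses would use dedicated tooling (e.g. a decomposition routine from a statistics library), but the statistical properties being measured are the same.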
An open, services-oriented approach has clear advantages for building modular and scalable applications. We should take the same approach to our data.
Visualizing time series data is often the first step in observing trends that can guide time series modeling and analysis.
Apache Pulsar is an open source streaming platform that addresses some important limitations in Kafka, particularly for cloud-native applications.
Time series data yields key insights in domains ranging from science and medicine to systems monitoring and industrial IoT. Understand time series data and the databases designed to ingest, store, and analyze it.
How the fully managed Kafka service can bring peace and simplicity to the lives of those who depend on event streaming infrastructure.
PostgreSQL continues to improve in ways that meet the needs of even the most complex, mission-critical use cases. It also presents certain challenges.
ProxyJump forwards the stdin and stdout of the local client to the destination host, allowing us to set up jump servers without giving them direct SSH access.
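A minimal illustration of the ProxyJump directive in an SSH client configuration (host names here are hypothetical, not from the article):

```
# ~/.ssh/config
Host bastion
    HostName bastion.example.com
    User admin

Host internal-db
    HostName 10.0.12.7
    User admin
    ProxyJump bastion
```

With this in place, `ssh internal-db` transparently tunnels through the jump host, equivalent to running `ssh -J bastion internal-db` by hand.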
At Snowflake, we fully embrace the value of open standards and open source. But we strive to avoid misguided applications of open that create costly complexity instead of low-cost ease of use.
Processes and process automation take many forms. Here’s how to navigate the growing ecosystem of tools for automating everything from simple repetitive tasks to complex custom workflows.
Pulling from container registries is key to ensuring the health and resilience of the CI/CD pipeline. Choose your registry with care.
An in-memory digital integration hub enables flexible, real-time information flow between mainframes and external systems, unlocking mainframe data for digital transformation.
You can avoid command line tedium and simplify access to a fleet of servers by creating a flexible configuration file for your SSH client. Here’s how.
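A sketch of what such a configuration might look like for a small fleet (host names and addresses are illustrative; 203.0.113.0/24 is a documentation range):

```
# ~/.ssh/config
Host web-*
    User deploy
    IdentityFile ~/.ssh/fleet_ed25519
    ServerAliveInterval 60

Host web-1
    HostName 203.0.113.10

Host web-2
    HostName 203.0.113.11
```

The wildcard `Host web-*` block supplies shared defaults, so `ssh web-1` needs no flags at all.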
As Moore’s Law loses steam, off-loading data compression, data encryption, low-level data processing, and other heavy-duty computation tasks to storage nodes makes sense. Here’s how that would work.
9 out of 10 companies have accelerated their cloud adoption in response to the coronavirus pandemic, with corresponding increases in cloud spend—and waste.
By decoupling policy from applications, policy as code allows you to change the coding for policy without changing the coding for apps. Translation: reliability, uptime, and efficiency.
Enterprise-grade explainability solutions provide fundamental transparency into how machine learning models make decisions, as well as broader assessments of model quality and fairness. Is yours up to the job?
The past 12 months have revealed how valuable data science can be while also exposing its limitations. Expect big advances in the year to come.
A standard operating environment can reduce the time it takes to deploy, configure, maintain, support, and manage containerized applications. Let’s get SOEs and containers back together.
Build a React application to track the orbit of the International Space Station using Telegraf, InfluxDB, ExpressJS, and Giraffe.
Aerospike’s Cross-Datacenter Replication with Expressions makes it easy to route the right data at the right time across global applications to meet compliance mandates and reduce server, cloud, and bandwidth costs.
A biased AI model must have learned a biased relationship between its inputs and outputs. We can fix that.
6 steps IT departments can take to identify IT spend and optimize reinvestments in digital transformation initiatives.
3 conscious experience design principles for integrating ‘living’ AI into your applications, and how to implement them.
After a trying 2020, signs point to data science becoming an enterprise-wide capability that impacts every line of business and functional department in the coming year.
Developments in containers and virtualization, container tooling, containers for edge computing, and Kubernetes you should have on your radar in the coming year and beyond.
Devops teams are flocking to GitOps strategies to accelerate development time frames and eliminate cloud misconfigurations. They should adopt a similar ‘as-code’ approach to policy.
CockroachDB 20.2 brings a Kubernetes operator, spatial data, a new storage engine, SQL enhancements, and much more, extending the range of workloads for which the database can be used.
With Sentry Performance Monitoring for PHP, developers can quickly identify performance issues with PHP-based applications and view end-to-end traces to pinpoint issues and surface related errors.
Why we must use a zero-trust security model in microservices and how to implement it using the Kuma universal service mesh.
Viable, large-scale quantum computers will require better qubits, better control chip technology, more advanced error correction, and new components at every layer of the stack. We’re making progress.
How Open Policy Agent allows developer teams to write and enforce consistent policy and authorization across multicloud and hybrid cloud environments.
By combining machine learning and adaptive query execution, query optimization in Presto could become smarter and more efficient over repeated use.