Back To The Future of Real-Time Applications
Real-time systems have a long history. A new economic tipping point means it's now time to go back to the future...
For the last few decades, operations and analytics have been firmly separated in enterprise architectures, with a different system for each need.
Real-time processing itself, though, is far from a new idea -- here, for example, is the text of a 1956 Univac computer ad:
"...there's only one commercially available platform capable of real time performance... It's the ideal system for...simulation and on-line data reduction.
It solves complex problems from purely sensed data at speeds that are compatible with real-time control.
Because of its ability to reduce large volumes of data at tremendous speeds, the...system easily handles even the most difficult problems.
Furthermore, it offers many other outstanding characteristics, including: superb operating efficiency, large capacity, great versatility, the ability to interface with a wide variety of different types of data, and far greater reliability than any other computer of its type...."
Just like some of today's "internet of things" systems, the Univac scientific machine was designed to process information in-memory, directly from sensors (it could also access information from magnetic tape or punch cards).
Note how perfectly the ad ticks all the modern buzzword boxes, promising the 3Vs of big data (volume, variety, and velocity), along with powerful analytics and data compression.
Unfortunately for business in the 1950s, rising amounts of data and cheaper disk storage quickly upended the economics of computing -- and we have had to deal with the complexity and latency of separate operational and analytic systems ever since.
We're now seeing a new tipping point. The falling price of memory and the rising cost of complexity mean that in-memory platforms like SAP HANA are once again the simplest and most economical way to run business systems.
The result is that although moving to hybrid transactional/analytical processing (HTAP) entails "upheaval in the established architectures", according to Gartner, we're actually getting back to the fundamentals of what we've always wanted: a single system that handles operations and analytics together.
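To make the HTAP idea concrete, here is a minimal sketch using Python's built-in sqlite3 module with an in-memory database. It is a toy stand-in, not SAP HANA, and the table, columns, and data are invented for illustration; the point is simply that one in-memory store serves both the transactional writes and the analytical query, with no ETL pipeline or separate warehouse in between.

```python
# Toy HTAP-style illustration: one in-memory store for both operations
# and analytics. (Hypothetical schema and data; not a HANA example.)
import sqlite3

db = sqlite3.connect(":memory:")  # a single in-memory store
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Operational side: transactional inserts as orders arrive.
with db:
    db.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("EMEA", 120.0), ("APAC", 80.5), ("EMEA", 42.0)],
    )

# Analytical side: an aggregate over the very same live rows --
# no batch export, no overnight load, no second system.
for region, total in db.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(region, total)
```

In the traditional two-system architecture, the aggregate above would instead run against a warehouse populated by a periodic copy job, which is exactly the latency and complexity the in-memory approach removes.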