



QUERY STREAMING DATAFRAMES

Open-core* query engine for building apps and analytics with real-time streams
and batch data

Try Live Demo

or start with Docker

curl https://raw.githubusercontent.com/deephaven/deephaven-core/main/containers/python/base/docker-compose.yml -O
docker-compose pull
docker-compose up
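
Once the containers are up, a quick way to confirm the server is reachable is a tiny
query from the Python client. This is only a sketch: it assumes the default gRPC port
(10000 on localhost) and that the pydeephaven package is installed locally.

from pydeephaven import Session

# connects to localhost:10000 by default; raises if the server is unreachable
session = Session()
t = session.empty_table(5).update(["X = i"])   # run a trivial query server-side
print(t.snapshot().to_pandas())                # pull the result back as a DataFrame
session.close()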



t = quotesAll.aj(quotesSpy, "Timestamp", "WtdMid_Spy = WtdMid") \
    .updateView("Ratio = WtdMid_Spy / WtdMid")
Deephaven has been battle-tested inside prominent hedge funds, investment banks,
and stock exchanges, managing billions in assets. Every day.

DATA SYSTEM

A powerful query engine and framework providing tools and experiences for the
whole team


DATA SOURCES

Access and ingest data directly from popular, standard formats.


DATA PROCESSING

Build applications and do analytics in Python, Java, or C++, with a single
abstraction for batches and streams. Use time-series features and complex joins.
Combine custom functions with table operations on both ticking and static data,
and work with familiar libraries like pandas, TensorFlow, and Numba.
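
A minimal sketch of that single abstraction, assuming a running Community Core server
and the current deephaven Python API (time_table, update, and the built-in ii row
index); the function and column names here are illustrative:

from deephaven import time_table
import numpy as np

def scaled(x) -> float:
    # ordinary Python/NumPy code, callable from a query string
    return float(np.sqrt(x))

# a ticking table that appends one row per second
ticking = time_table("PT1S").update(["X = ii"])

# the same expression works unchanged on static tables
derived = ticking.update(["Root = scaled(X)"])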


DATA CONSUMERS

Exhaust new streams or write to persistent stores; build and share real-time
visualizations and monitors. Explore massive, ticking datasets with built-in
tools. Build enterprise apps.
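
As a sketch of the write-to-persistent-stores path, assuming the deephaven.parquet
module from recent Community Core releases (table contents and file path are
illustrative):

from deephaven import empty_table, parquet

# a small static table to persist
snapshot = empty_table(5).update(["X = i", "Y = X * X"])

# write it to a Parquet file on disk
parquet.write(snapshot, "/data/snapshot.parquet")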

WHY DEEPHAVEN?


STREAMING DATA
DONE RIGHT


SERIOUS
PERFORMANCE

Deephaven is engineered to track table additions, removals, modifications, and
shifts, so users benefit from its highly optimized, incremental-update model. A
chunk-oriented architecture delivers best-in-class table methods and amortizes
the cost of moving between languages.

Client-server interfaces are designed with large-scale, dense data in mind,
moving compute to the server and providing lazy updates.


BUILD, JOIN, AND PUBLISH STREAMS WITH EASE

Build streams on streams to power applications and drive analysis. Use table
operations on their own or marry them to custom and third-party libraries. Query
and combine batch and real-time data.


HIGHLY
INTUITIVE

New data and events seamlessly arrive as simple table updates. Queries establish
an acyclic graph, with data logically flowing to downstream nodes. Simply name a
source or derived table to make it available to clients via multi-language APIs.
Use easy methods to stripe and pipeline workloads.
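
For example, a hedged sketch of that naming flow, assuming pydeephaven's
Session.open_table; the table name my_metrics is illustrative. Any table bound to a
variable in a server-side script becomes fetchable by name from a client:

# server-side script: binding a table to a variable publishes it under that name
from deephaven import time_table
my_metrics = time_table("PT1S").update(["X = ii"])

# Python client, in a separate process: fetch the named table
from pydeephaven import Session
session = Session()                          # default localhost connection
remote = session.open_table("my_metrics")    # resolves the server-side name
print(remote.snapshot().to_pandas())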


FAMILIAR &
POWERFUL TOOLS

Leverage gRPC and Arrow. Use Jupyter, Visual Studio, JetBrains, or [soon]
RStudio. Bring your custom or third-party libraries and functions to the data
for faster, well-integrated execution. Enjoy the data-interrogation experience
of the Code Studio, with dynamic dashboards and an evolving suite of
capabilities.




EXPRESSIVE LANGUAGE


BUILT FOR DEVELOPERS, LOVED BY DATA SCIENTISTS

COMBINE YOUR STATIC AND REAL-TIME DATA SOURCES

Join and merge Kafka streams with Parquet files. Use identical operations on
batch and stream.

JOIN YOUR TIME SERIES AND AGGREGATE

STREAMLINE YOUR DATA SCIENCE

LEVERAGE GRPC AND ARROW FLIGHT LIBRARIES

from deephaven import ConsumeKafka, ParquetTools, TableTools

# data-ingestion integrations (Kafka, Parquet, and many more)
table_today_live = ConsumeKafka.consumeToTable(
    {"bootstrap.servers": "kafka:9092"}, "metrics"
)
table_yesterday = ParquetTools.readTable("/data/metrics.parquet")

# merging dynamic with static is easy; the updating table will continue to update
table_merged = TableTools.merge(table_today_live, table_yesterday)

# operators can be used identically on dynamic and static tables (or merges of the two)
table_joined = table_today_live.sumBy("ProcessKey").naturalJoin(
    table_yesterday.sumBy("ProcessKey"), "ProcessKey", "YestTotal = Metric"
)




bitcoin = ConsumeKafka.consumeToTable({"bootstrap.servers": "kafka:9092"}, "bitcoin")
ethereum = ConsumeKafka.consumeToTable({"bootstrap.servers": "kafka:9092"}, "ethereum")

# time series joins update as source tables update
priceRatio = (
    bitcoin.aj(ethereum, "Timestamp", "SizeEth = Size, PriceEth = Price")
    .update("Ratio = Price / PriceEth")
    .renameColumns("SizeBtc = Size")
)

# time-bin by minute and aggregate accordingly
agg = priceRatio.update("TimeBin = upperBin(Timestamp, MINUTE)").by(
    ["TimeBin"],
    [
        AggAvg("Ratio"),
        AggMin("MinRatio = Ratio"),
        AggMax("MaxRatio = Ratio"),
        AggSum("Size", "SizeBtc"),
        AggWAvg("SizeBtc", "VwapBtc = Price"),
    ],
)




import numpy as np
from sklearn.linear_model import LinearRegression

# write a custom function
def computeBeta(value1, value2):
    stat1 = np.diff(np.array(value1), n=1).reshape(-1, 1)
    stat2 = np.diff(np.array(value2), n=1).reshape(-1, 1)
    reg = LinearRegression(fit_intercept=True)
    reg.fit(stat1, stat2)  # fit on the differenced series computed above
    return reg.coef_[0][0]


# filter, sort and do time-series joins on source tables
iot = source.where("MeasureName = `Example`").view(
    "TimeInterval", "DeviceId", "MeasureValue"
)
iot_joined = iot.aj(iot.where("DeviceId = `Master`"), "TimeInterval", "Measure_Master = MeasureValue")

# use the custom function within the deephaven object directly
# no client-server or copy
betas = (
    iot_joined.by("DeviceId")
    .select(
        "DeviceId",
        "Beta = (double) computeBeta.call(Measure_Master.toArray(), MeasureValue.toArray())",
    )
    .sort("DeviceId")
)



 * Java Client
 * Python Client
 * C++ Client
 * JavaScript Client

FlightSession session = newSession();
TableSpec trades = readQst("trades.qst");
TableSpec quotes = readCsv("quotes.csv");
TableSpec topTenTrades = trades
    .aj(quotes, "Timestamp", "Mid")
    .updateView("Edge=abs(Price-Mid)")
    .sortDescending("Edge")
    .head(10);
try (
    final Export export = session.export(topTenTrades);
    final FlightStream flightStream = session.getStream(export)) {
    while (flightStream.next()) {
        System.out.println(flightStream.getRoot().contentToTSVString());
    }
}




from pydeephaven import Session
from pyarrow import csv

session = Session()  # assuming DH is running locally with the default config

table1 = session.import_table(csv.read_csv("data1.csv"))
table2 = session.import_table(csv.read_csv("data2.csv"))
joined_table = table1.join(
    table2, keys=["key_col_1", "key_col_2"], columns_to_add=["data_col1"]
)

df = joined_table.snapshot().to_pandas()
print(df)
session.close()




auto client = Client::connect(server);
auto manager = client.getManager();
auto trades = manager.fetchTable("trades");
auto quotes = manager.readCsv("quotes.csv");
auto topTenTrades = trades
    .aj(quotes, "Timestamp", "Mid")
    .updateView("Edge=abs(Price-Mid)")
    .sortDescending("Edge")
    .head(10);
std::cout << topTenTrades.stream(true) << '\n';




class TableView {
  setFilter() {
    this._filters = Array.prototype.slice.apply(arguments);
    return this._table.applyFilter(this._filters);
  }
  addFilter(filter) {
    this._filters.push(filter);
    return this._table.applyFilter(this._filters);
  }
  // Use cloning when you want to create a new table
  // to apply filters without modifying the existing table.
  clone(name) {
    if (!name) {
      name = `${this._name}Clone`;
    }
    return this._table.copy().then((newTable) => new TableView(name, newTable));
  }
}





UI TOOLS


OPEN-SOURCE CODE STUDIO FOR ACCELERATED DATA EXPLORATION

Browser-based interactive REPL for immediate feedback.
Industry-leading data grid handles billions of rows with ease.
Plot large data sets with automatic downsampling.
Auto-complete column names for rapid data exploration.
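
A minimal plotting sketch to go with those features, assuming the deephaven.plot
Figure API from recent Community Core releases (table and column names are
illustrative); the Code Studio downsamples large series automatically:

from deephaven import time_table
from deephaven.plot.figure import Figure

# a ticking series; the chart keeps updating as rows arrive
prices = time_table("PT1S").update(["Price = 100 + ii % 50"])
fig = Figure().plot_xy(series_name="price", t=prices, x="Timestamp", y="Price").show()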



BUILD WITH DEEPHAVEN


WHAT CAN YOU BUILD WITH DEEPHAVEN?

Use one of the following example apps or starter projects to get going fast.

INHERIT KAFKA STREAMS AS UPDATING TABLES

A demo of two ways to consume events.

Use an interactive demo

COMBINE STREAMING FEEDS WITH PYTHON

A machine that uses Twitter and colors to solve WORDLE.

Watch the video

DRIVE UX WITH WEB-SCRAPED CONTENT

A dashboard for live sports betting lines.

See project's GitHub

BUILD APPS WITH REAL-TIME DATA

A stock monitor using Redpanda & dxFeed.

Read blog linked to code

DO AI IN REAL TIME

Dynamic unsupervised learning to detect fraud.

Read blog linked to code

SOURCE TICKING DATA FROM CUSTOM APIS

A framework for trading via Interactive Brokers.

See project's GitHub

INTEROPERATE WITH YOUR TOOLS

A plugin demo for matplotlib.

Watch the video

PULL DATA FROM REST APIS

An integration with Prometheus.

Read blog linked to code

SCALE UP


ENTERPRISE DEPLOYMENT

Deephaven Enterprise has been battle-tested inside the demanding environments of
hedge funds, stock exchanges, and banks. Its collection of enterprise-ready tools
and exclusive add-ons helps your team scale up quickly and benefit from
mutualized enhancement requests. Professional services are available if you’d
like more hands on deck.




BATTERIES INCLUDED DATA MANAGEMENT

DATA MANAGEMENT

Systems for ingesting, storing, and disseminating data focus on throughput and
efficiency. Utilities support cleaning, validation, and transformation.
Sophisticated access controls limit user or team access to source and derived
data by directory and table, as well as granularly by row or column key.

SCALE ACROSS 1000S OF CORES, PBS OF DATA, AND TBS OF STREAMS

QUERY & COMPUTE

The Deephaven Enterprise platform comprises the machinery, operations, and
workflows to develop and support applications and analytics at scale, real-time
and otherwise. It is readily deployed on commodity cloud or physical Linux
resources using modern techniques. Ingest, storage, and compute scale
independently.


CREATE AND SHARE APPLICATIONS AND INTERACTIVE DASHBOARDS QUICKLY

UI & TOOLING

Deephaven Enterprise offers premier experiences in Jupyter, Excel, RStudio,
classic IDEs, and its own REPL, and it also includes a zero-time UX for
launching, scheduling, and monitoring applications. These feed dependent
enterprise apps and empower the quick configuration and sharing of real-time
dashboards.



INTEGRATIONS


INTEGRATES WITH FAMILIAR AND POWERFUL TOOLS



Copyright © 2022 Deephaven Data Labs LLC