
Tag: Snowflake

Snowflake Adds Python Support with Winter Release

In a nod to the growing importance of data science and AI development on its platform, Snowflake today announced that its upcoming Winter Release will add support for executing code written in Python, the most popular programming language in the world and the number one language for developing machine learning models.
Support for Python is in private preview and is being added to Snowpark, Snowflake's compute framework for automating computational workflows for data analytics, data science, and data engineering use cases. Snowflake launched Snowpark one year ago with support for Java…

How Snowflake Is Providing AI-Based Solutions to Its Customers Hassle-Free

Artificial intelligence is driving discoveries and innovations across most industries these days. Several fields have already been revolutionized by AI, and many more are yet to be transformed. The potential of AI technology today is virtually limitless, and its products are safe bets to grab a larger share of the marketplace. Snowflake Inc. is a cloud-based data warehousing company based in Montana. The company delivers the Data Cloud, where thousands of organizations mobilize data with near-unlimited scale,…

In the Battle Between Databricks and Snowflake, Is SingleStore the Real Winner?

In September this year, California-based SingleStore raised $80 million in primary capital funding in a Series F round. With this funding, the reported valuation of the company climbed to $940 million. Its high-profile investors include Insight Partners, HPE, Khosla Ventures, Dell Technologies Capital, Rev IV, and GV (previously Google Ventures).
The last two years have been a very significant chapter in SingleStore's growth story. The company had its Series E funding in 2020, where it also raised $80 million. That round came on the heels of its rebranding from MemSQL….

[DBT] Set Snowflake Query Tag for each DBT model [Tip-2]

The query tag feature in dbt is a database-specific configuration. In this article, let's see how to customize it for Snowflake. Query tags are a Snowflake parameter that can be quite useful later on when searching the QUERY_HISTORY view.

A default query tag can be set in profiles.yml. This will be applied to all queries triggered from dbt to Snowflake.

There are three ways to configure a query tag:

1. Configure it in dbt_project.yml. The drawback of this approach is that we can only set a string; we can't execute a macro with the current dbt version. For now, dbt supports only a few lists of…
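As a sketch, setting a default query tag in profiles.yml might look like the following. The profile, account, and object names here are placeholders, not from the article:

```yaml
# profiles.yml -- names below are hypothetical placeholders
my_snowflake_profile:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: xy12345.us-east-1
      user: dbt_user
      role: transformer
      database: analytics
      warehouse: transforming
      schema: dbt_dev
      query_tag: dbt   # applied to every query dbt issues via this profile
```

With this in place, queries run by dbt should show up in Snowflake's QUERY_HISTORY view with QUERY_TAG = 'dbt', making them easy to filter.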

How to Transform Your Data in Snowflake

Data transformation is the biggest bottleneck in the analytics workflow. The modern approach to data pipelines is ELT, or extract, load, and transform, with data transformation performed in your Snowflake data warehouse. A new breed of "no-/low-code" data transformation tools, such as Datameer, is emerging to allow the wider analytics community to transform data on their own, eliminating analytics bottlenecks.

How to Migrate Your Data from Redshift to Snowflake

Get your data out of Redshift and into Snowflake easily with open source data integration. For decades, data warehousing solutions have been the backbone of enterprise reporting and business intelligence. But in recent years, cloud-based data warehouses like Amazon Redshift and Snowflake have become extremely popular. So why would someone want to migrate from one cloud-based data warehouse to another? The answer is simple: more scale and flexibility. With Snowflake, users can…

The Basics of Hash Tables

Let’s take a look at an example of a hash function.

static int hashCode(String s) {
    int h = 0;  // h = s[0]*31^(n-1) + s[1]*31^(n-2) + ... + s[n-1]
    for (char c : s.toCharArray()) h = 31 * h + c;
    return h;
}

This is the hash function used by Java's String class, where s is the character array of the String, e.g. s[0] is the first character, and n is the length of the String. Let's see it in action.

hashCode("Apple") // 63476538
hashCode("Costco") // 2024204569
hashCode("Ikea") // 2280798
hashCode("Snowflake") // 1973786418

Notice from the few examples ☝️ that the hash function outputs a different hash code value for each String object. As mentioned earlier, each key is expected to be unique, so ideally the hash function produces a unique hash code for each key. In practice, though, two distinct keys can map to the same hash code; this is called a collision, and hash table implementations must handle it.
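To make the collision caveat concrete, here is a minimal sketch using Java's built-in String.hashCode (the class name is illustrative):

```java
public class HashCollisionDemo {
    public static void main(String[] args) {
        // Two distinct strings can share a hash code: a collision.
        // "Aa": 'A'*31 + 'a' = 65*31 + 97 = 2112
        // "BB": 'B'*31 + 'B' = 66*31 + 66 = 2112
        System.out.println("Aa".hashCode()); // 2112
        System.out.println("BB".hashCode()); // 2112
    }
}
```

Because collisions like this are unavoidable for any hash function with a fixed-size output, hash tables pair the hash code with a collision-resolution strategy such as chaining or open addressing.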


Is your data strategy missing the “Mark”?

The benefits realized by any data initiative will be coupled to, and limited by, the maturity of an organization's information literacy.


Introducing the DataRobot AI Cloud: A Closer Look

Since I joined DataRobot nearly two years ago, I’ve been fortunate to spend much of my time meeting with and learning from our users and customers. Time and time again, we hear about the need for AI to support cross-functional teams and users. To provide the ability to integrate diverse data sources. To offer the flexibility to deploy AI solutions anywhere. To support the need to connect AI-driven decisions directly with existing business applications and services, like Snowflake, Salesforce, and ServiceNow. Most critically, to unify the ability to do it all in a single environment.

Today, I’m proud to announce the launch of the DataRobot AI Cloud: our first-of-its-kind, integrated, end-to-end platform delivering clear and powerful predictions to power business decisions for all organizations. …

Building Machine Learning Pipelines using Snowflake and Dask

By Daniel Foley, Data Scientist


Recently I have been trying to find ways to improve my workflow as a data scientist. I tend to spend a decent chunk of my time modelling and building ETLs in my job. This means that, more and more, I need to rely on tools that can handle large datasets reliably and efficiently. I quickly realised that using pandas to manipulate these datasets is not always a good approach, and this prompted me to look into alternatives.

In this post, I want to share some of the tools I have been exploring recently, show you how I use them, and explain how they helped improve the efficiency of my workflow. The two I will talk about in particular are Snowflake and Dask: two very different tools, but ones that complement each other well, especially as part of the ML lifecycle.