The DEA-C01 exam collection has been fully updated by Snowflake. Download from killexams.com today

killexams.com provides the latest and refreshed DEA-C01 real questions with Questions and Answers for new topics. Practice our DEA-C01 question bank to improve your knowledge and pass your DEA-C01 test with high marks. We ensure your success in the Test Center, covering all of the test objectives and developing your knowledge of the DEA-C01 test. Pass without question with our real questions.

DEA-C01 SnowPro Advanced Data Engineer test | http://babelouedstory.com/

DEA-C01 test - SnowPro Advanced Data Engineer Updated: 2024

killexams.com DEA-C01 braindumps question bank
Exam Code: DEA-C01 SnowPro Advanced Data Engineer test January 2024 by Killexams.com team
SnowPro Advanced Data Engineer
SnowFlake Advanced test

Other SnowFlake exams

DEA-C01 SnowPro Advanced Data Engineer
DSA-C02 SnowPro Advanced: Data Scientist
ARA-C01 SnowPro Advanced Architect Certification
COF-R02 Snowflake SnowPro Core Recertification

It is a time-consuming and costly activity to use free DEA-C01 dumps and take the official DEA-C01 test. You should obtain killexams.com's updated and valid DEA-C01 test dumps, which include real test Questions and Answers with a VCE test simulator built from DEA-C01 braindumps. No more wasted money and time. Just take the DEA-C01 test and pass.
Question: 26
In which scenarios would a data engineer decide that materialized views are not useful? Select all that apply.
A. Query results contain a small number of rows and/or columns relative to the base table (the table on which the view
is defined).
B. Query results contain results that require significant processing.
C. The query is on an external table (i.e. data sets stored in files in an external stage), which might have slower
performance compared to querying native database tables.
D. The view's base table changes frequently.
Answer: D
Explanation:
Materialized views store the pre-computed result of a query so it can be referenced without re-running the underlying query. Per Snowflake's guidance, options A, B, and C actually describe scenarios where materialized views ARE useful:
A. When query results contain a small number of rows and/or columns relative to the base table, an expensive scan or aggregation is reduced to a small stored result, which is a classic materialized view use case.
B. When query results require significant processing, storing the result in a materialized view improves subsequent query performance by eliminating the need to redo the same heavy computations.
C. Queries on external tables (i.e. data sets stored in files in an external stage) tend to be slower than queries on native tables, so a materialized view over an external table can improve performance; Snowflake supports materialized views on external tables.
D is the scenario where a materialized view is NOT useful. If the base table changes frequently, the materialized view must be refreshed continuously; this maintenance overhead can negate the advantages of having the materialized view in the first place.
So the only scenario where a data engineer would decide that materialized views are not useful is:
D. The view's base table changes frequently.
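To make the trade-off concrete, a materialized view is just a pre-computed query result that Snowflake keeps in sync with its base table. A minimal sketch (table and view names are hypothetical):

```sql
-- Hypothetical example: pre-aggregate a large, slowly changing SALES table.
-- Useful because the result is small relative to SALES and costly to compute;
-- it would NOT be useful if SALES changed frequently (constant refresh overhead).
CREATE MATERIALIZED VIEW daily_sales_mv AS
    SELECT sale_date, SUM(amount) AS total_amount
    FROM sales
    GROUP BY sale_date;

-- Queries now read the small pre-computed result instead of scanning SALES.
SELECT total_amount FROM daily_sales_mv WHERE sale_date = '2024-01-01';
```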
Question: 27
Partition columns optimize query performance by pruning the data files that do not need to be scanned (i.e., partitioning the external table).
Which pseudocolumn of an external table evaluates as an expression that parses the path and/or filename information?
A. METADATA$ROW_NUMBER
B. METADATA$COLUMNNAME
C. METADATA$FILEPATH
D. METADATA$FILENAME
Answer: D
Explanation:
The pseudocolumn of an external table that partition columns parse is:
D. METADATA$FILENAME
In a Snowflake external table, METADATA$FILENAME identifies each staged data file included in the table, including its path in the stage. A partition column must evaluate as an expression that parses the path and/or filename information in this pseudocolumn, which allows the query planner to prune the data files that do not need to be scanned. (There is no METADATA$FILEPATH pseudocolumn for external tables.)
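Snowflake's documentation describes partition columns as expressions over METADATA$FILENAME; a hedged sketch of a partitioned external table (the stage name, file format, and path layout are hypothetical):

```sql
-- Hypothetical example: files are staged under paths like logs/2024/01/04/part-0.parquet,
-- so a date partition column can be parsed out of METADATA$FILENAME.
CREATE EXTERNAL TABLE ext_logs (
    log_date DATE AS TO_DATE(
        SPLIT_PART(METADATA$FILENAME, '/', 2) || '-' ||
        SPLIT_PART(METADATA$FILENAME, '/', 3) || '-' ||
        SPLIT_PART(METADATA$FILENAME, '/', 4), 'YYYY-MM-DD')
)
PARTITION BY (log_date)
LOCATION = @my_stage/logs/
FILE_FORMAT = (TYPE = PARQUET)
AUTO_REFRESH = TRUE;

-- Filtering on the partition column prunes files that don't need to be scanned.
SELECT COUNT(*) FROM ext_logs WHERE log_date = '2024-01-04';
```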
Question: 28
A data engineer identified a use case where he decided to use a materialized view for query performance.
Which one is NOT a limitation he must be aware of before using MVs in this use case?
A. Truncating a materialized view is not supported.
B. Time Travel is not currently supported on materialized views.
C. You cannot directly clone a materialized view by using the CREATE MATERIALIZED VIEW ... CLONE ... command.
D. A materialized view can query only a single table; joins, including self-joins, are not supported.
E. A materialized view does not support clustering.
F. A materialized view cannot be created on shared data.
G. A materialized view cannot include a HAVING clause or an ORDER BY clause.
H. Context functions like CURRENT_TIME or CURRENT_TIMESTAMP are not permitted.
Answer: E
Explanation:
Among the options provided, the statement that is NOT a limitation is:
E. A materialized view does not support clustering.
Snowflake does support clustering materialized views; defining a clustering key on a materialized view is a documented way to improve its query performance. The remaining statements are genuine limitations of materialized views.
Question: 29
group by m.item_id;
Step 3: After 1 hour, he decided to temporarily suspend the use (and maintenance) of the DataReportMV materialized view for cost-saving purposes.
alter materialized view DataReportMV suspend;
What is Alex doing wrong here?
A. The materialized view DataReportMV does not support join operations, so Step 2 would fail and he could not proceed further.
B. Materialized views on top of external tables are not a supported feature.
C. Once DataReportMV is suspended, any query on top of the view will generate an error like: Failure during expansion of view 'DATAREPORTMV': SQL compilation error: Materialized view DataReportMV is invalid.
D. There is no SUSPEND command for temporarily suspending materialized views; Step 3 will give an error like 'invalid Suspend command'.
E. Alex is doing everything correctly.
Answer: A
Explanation:
A. Snowflake materialized views cannot contain joins (including self-joins), so a CREATE MATERIALIZED VIEW statement whose defining query joins tables (as the GROUP BY over the alias m in Step 2 suggests) would fail, and Alex could not proceed further.
B is not a mistake: Snowflake supports materialized views over external tables; they are a documented way to improve external table query performance.
D is not a mistake either: ALTER MATERIALIZED VIEW <name> SUSPEND is a valid command for temporarily suspending the use and maintenance of a materialized view.
Question: 30
David, a Lead Data Engineer with XYZ company, is looking to improve query performance and gain other benefits while working with tables, regular views, MVs, and cached results.
Which one of the following does NOT correctly state the key similarities and differences between tables, regular views, cached query results, and materialized views?
A. Regular views do not cache data, and therefore cannot Improve performance by caching.
B. As with non-materialized views, a materialized view automatically inherits the privileges of its base table.
C. Cached Query Results: Used only if data has not changed and if query only uses deterministic functions (e.g. not
CURRENT_DATE).
D. Materialized views are faster than tables because of their cache (i.e. the query results for the view); in addition, if data has changed, they can use their cache for data that hasn't changed and use the base table for any data that has changed.
E. Both materialized views and regular views enhance data security by allowing data to be exposed or hidden at the
row level or column level.
Answer: B
Explanation:
Materialized Views, like other database objects (tables, views, UDFs, etc.), are owned by a role and have privileges
that can be granted to other roles.
You can grant the following privileges on a materialized view:
SELECT
As with non-materialized views, a materialized view does not automatically inherit the privileges of its base table.
You should explicitly grant privileges on the materialized view to the roles that should use that view.
As with non-materialized views, a user who wishes to access a materialized view needs privileges only on the view, not on the underlying object(s) that the view references. The rest of the statements are correct.
Question: 31
Melissa, a Senior Data Engineer, is looking to optimize query performance for one of the critical control dashboards. She found that most user searches on the dashboards are equality searches across all the underlying columns.
Which technique should she consider here?
A. She can go for clustering on the underlying tables, which can speed up equality searches.
B. A materialized view speeds up both equality searches and range searches.
C. The search optimization service would best fit here, as it can be applied to all underlying columns and speeds up equality searches.
D. Melissa can create indexes and hints on the searchable columns to speed up equality search.
Answer: C
Explanation:
Clustering a table can speed up any of the following, as long as they are on the clustering key:
Range searches.
Equality searches.
However, a table can be clustered on only a single key (which can contain one or more columns or expressions).
The search optimization service speeds up equality searches, and it applies to all the columns of supported types in a table that has search optimization enabled. This is what is required here and the best fit for the purpose.
A materialized view speeds up both equality searches and range searches, as well as some sort operations, but only for the subset of rows and columns included in the materialized view.
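Enabling the service is a single statement; a sketch (the table name is hypothetical):

```sql
-- Hypothetical example: enable the search optimization service on the dashboard table
-- so equality lookups on any supported column are accelerated.
ALTER TABLE control_dashboard_facts ADD SEARCH OPTIMIZATION;

-- SHOW TABLES reports whether search optimization is enabled and its build progress.
SHOW TABLES LIKE 'control_dashboard_facts';
```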
Question: 32
Search optimization works best to improve the performance of a query when the following conditions are true: [Select all that apply]
A. The table is not clustered.
B. The table is frequently queried on columns other than the primary cluster key.
C. The query uses equality predicates (for example, =) or predicates that use IN.
D. The query uses sort operations.
Answer: A,B,C
Explanation:
Materialized views work best for search query performance in the case of sort operations. For the rest of the points, search optimization works best to improve query performance.
Question: 33
Regular views do not cache data, and therefore cannot improve performance by caching?
A. TRUE
B. FALSE
Answer: A
Explanation:
Regular views do not cache data, and therefore cannot improve performance by caching.
Question: 34
Mark the correct statements about caching:
A. Materialized views are more flexible than, but typically slower than, cached results.
B. Materialized views are faster than tables because of their cache (i.e. the query results for the view); in addition, if data has changed, they can use their cache for data that hasn't changed and use the base table for any data that has changed.
C. For persisted query results of all sizes, the cache expires after 24 hours.
D. The size of the warehouse cache is determined by the compute resources in the warehouse.
E. The warehouse cache is dropped when the warehouse is suspended, which may result in slower initial performance for some queries after the warehouse is resumed.
Answer: A,B,C,D,E
Explanation:
How Does Warehouse Caching Impact Queries?
Each warehouse, when running, maintains a cache of table data accessed as queries are processed by the warehouse.
This enables improved performance for subsequent queries if they are able to read from the cache instead of from the
table(s) in the query. The size of the cache is determined by the compute resources in the warehouse (i.e. the larger the warehouse and, therefore, the more compute resources in the warehouse, the larger the cache).
This cache is dropped when the warehouse is suspended, which may result in slower initial performance for some
queries after the warehouse is resumed. As the resumed warehouse runs and processes more queries, the cache is
rebuilt, and queries that are able to take advantage of the cache will experience improved performance.
Keep this in mind when deciding whether to suspend a warehouse or leave it running. In other words, consider the
trade-off between saving credits by suspending a warehouse versus maintaining the cache of data from previous
queries to help with performance.

Using Persisted Query Results
When a query is executed, the result is persisted (i.e. cached) for a period of time. At the end of the time period, the
result is purged from the system.
Snowflake uses persisted query results to avoid re-generating results when nothing has changed (i.e. retrieval
optimization). In addition, you can use persisted query results to post-process the results (e.g. layering a new query on
top of the results already calculated). For persisted query results of all sizes, the cache expires after 24 hours.
Both materialized views and cached query results provide query performance benefits:
Materialized views are more flexible than, but typically slower than, cached results.
Materialized views are faster than tables because of their cache (i.e. the query results for the view); in addition, if
data has changed, they can use their cache for data that hasn't changed and use the base table for any data that has
changed.
Regular views do not cache data, and therefore cannot improve performance by caching.
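The persisted-result behaviour above can be controlled per session via the documented USE_CACHED_RESULT parameter; a sketch (the table name is hypothetical):

```sql
-- Re-running an identical query within 24 hours can be served from the persisted
-- result cache, provided the data hasn't changed and the query is deterministic.
SELECT COUNT(*) FROM big_table;

-- Disable reuse of persisted query results, e.g. when benchmarking warehouse performance.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```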
Question: 35
Marko, a Data Engineer, is using Snowpipe to load data in micro-batches for one of the finance data workloads. There is a set of files he attempted to load into a Snowflake table using Snowpipe. While monitoring, he found that a set of files has multiple issues. He queried the COPY_HISTORY view and checked the STATUS column, which indicates whether a particular set of files was loaded, partially loaded, or failed to load.
But he wants to view all errors in the files along with the load status. How can he check all errors?
A. He can check the RETURN_ALL_ERROR_MESSAGE column in the COPY_HISTORY view, which can provide a reason and view all errors in the files.
B. He can view all errors in the files by executing a COPY INTO statement with the VALIDATION_ERROR_MODE copy option set to RETURN_ALL_PIPE_ERRORS.
C. Marko can look for the FIRST_ERROR_MESSAGE column in the COPY_HISTORY view, which can provide a reason why a file partially loaded or failed.
D. He can view all errors in the files by executing a COPY INTO statement with the VALIDATION_MODE copy option set to RETURN_ALL_ERRORS.
Answer: C,D
Explanation:
To view details about errors when loading data into Snowflake, Marko can refer to specific columns in the relevant views and to validation options. Here's how each option fares:
A. Incorrect. There isn't a column named RETURN_ALL_ERROR_MESSAGE in the COPY_HISTORY view in Snowflake.
B. Incorrect. There isn't a VALIDATION_ERROR_MODE copy option or a RETURN_ALL_PIPE_ERRORS value in Snowflake's COPY INTO statement.
C. Correct. The FIRST_ERROR_MESSAGE column in the COPY_HISTORY view provides the first error message (if any) for a file that partially loaded or failed.
D. Correct. To view all errors in the files, Marko can execute a COPY INTO statement with the VALIDATION_MODE copy option set to RETURN_ALL_ERRORS, specifying the files to validate.
Therefore, the correct answers are C and D.
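A sketch of both monitoring paths (the table, stage, and file names are hypothetical):

```sql
-- Hypothetical example: load status and first error per file over the last 24 hours.
SELECT file_name, status, first_error_message
FROM TABLE(information_schema.copy_history(
        table_name => 'FINANCE_TXNS',
        start_time => DATEADD(hour, -24, CURRENT_TIMESTAMP())));

-- To surface every error in specific files, re-run COPY INTO in validation mode.
COPY INTO finance_txns FROM @finance_stage
    FILES = ('bad_file_1.csv')
    VALIDATION_MODE = RETURN_ALL_ERRORS;
```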
Question: 36
Robert, a Data Engineer, found that a pipe became stale because it was paused for longer than the limited retention period for event messages received for the pipe (14 days by default), and the previous pipe owner also transferred ownership of this pipe to Robert's role while the pipe was paused.
How can Robert resume this stale pipe in this case?
A. The pipe needs to be recreated in this scenario, as it is already past the 14-day period and stale.
B. He can apply the system function SYSTEM$PIPE_STALE_RESUME with an ALTER PIPE statement.
C. Robert can use the SYSTEM$PIPE_FORCE_RESUME function to resume this stale pipe.
D. select system$pipe_force_resume('mydb.myschema.stalepipe','staleness_check_override, ownership_transfer_check_override');
E. An ALTER PIPE ... RESUME statement will resume the pipe.
Answer: D
Explanation:
When a pipe is paused, event messages received for the pipe enter a limited retention period. The period is 14 days by
default. If a pipe is paused for longer than 14 days, it is considered stale.
To resume a stale pipe, a qualified role must call the SYSTEM$PIPE_FORCE_RESUME function and pass the STALENESS_CHECK_OVERRIDE argument. This argument indicates an understanding that the role is resuming a stale pipe.
For example, resume the stale stalepipe1 pipe in the mydb.myschema database and schema:
SELECT SYSTEM$PIPE_FORCE_RESUME('mydb.myschema.stalepipe1','staleness_check_override');
If ownership of the pipe was transferred to another role while the stale pipe was paused, then resuming the pipe requires the additional OWNERSHIP_TRANSFER_CHECK_OVERRIDE argument. For example, resume the stale stalepipe2 pipe in the mydb.myschema database and schema, which was transferred to a new role:
SELECT SYSTEM$PIPE_FORCE_RESUME('mydb.myschema.stalepipe2','staleness_check_override, ownership_transfer_check_override');
Question: 37
How can a data engineer monitor files that are staged internally during continuous data pipeline loading? [Select all that apply]
A. She can monitor the files using metadata maintained by Snowflake, e.g. file name, last_modified date, etc.
B. Snowflake retains historical data for COPY INTO commands executed within the previous 14 days.
C. She can monitor the status of each COPY INTO command on the History tab page of the classic web interface.
D. She can use the LOAD_HISTORY Information Schema view to retrieve the history of data loaded into tables using the COPY INTO command.
E. She can use the DATA_VALIDATE function to validate the data files she has loaded and retrieve any errors encountered during the load.
Answer: A,B,C,D
Explanation:
Let's evaluate each option:
A. Correct. Snowflake maintains metadata about staged files (e.g. file name, last_modified date), which can be queried.
B. Correct. Snowflake retains the history of COPY INTO commands executed within the previous 14 days, which provides visibility into data-loading operations for that time frame.
C. Correct. The Snowflake classic web interface provides a History tab where users can monitor and review past queries, including COPY INTO commands.
D. Correct. The LOAD_HISTORY view in the Information Schema is explicitly meant to provide the history of data loaded into tables using the COPY INTO command.
E. Incorrect. There isn't a DATA_VALIDATE function in Snowflake; to retrieve errors from a previous load, you'd use the VALIDATE table function or views such as COPY_HISTORY.
Based on the above evaluation, the correct options are:
A, B, C, and D.
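A sketch of these monitoring options (the stage and table names are hypothetical):

```sql
-- List files in a named internal stage; Snowflake returns name, size, and
-- last_modified metadata for each staged file.
LIST @my_internal_stage;

-- Retrieve the history of data loaded via COPY INTO within the last 14 days.
SELECT file_name, status, row_count, last_load_time
FROM information_schema.load_history
WHERE table_name = 'MY_TABLE'
ORDER BY last_load_time DESC;
```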
Question: 38
To help manage stage storage costs, the data engineer recommended monitoring stage files and removing them from the stages once the data has been loaded and the files are no longer needed.
Which options can he choose to remove these files, either during data loading or afterwards?
A. He can choose to remove stage files during data loading (using the COPY INTO command).
B. Files no longer needed can be removed using the PURGE=TRUE command.
C. Files no longer needed can be removed using the REMOVE command.
D. A script can be used during and after data loading with the DELETE command.
Answer: A,C
Explanation:
Let's evaluate each option:
A. Correct. Using the PURGE=TRUE copy option with the COPY INTO command removes the files from the stage once they've been successfully loaded.
B. Incorrect. In Snowflake, PURGE=TRUE is a copy option of the COPY INTO command, not a standalone command.
C. Correct. The REMOVE command can be used on a stage to remove specific files or patterns of files that are no longer needed.
D. Incorrect. In Snowflake, you don't use the DELETE command to remove files from stages; you'd use the REMOVE command.
Based on the above evaluation, the correct options are:
A and C.
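A sketch of both cleanup options (the stage, table, and pattern are hypothetical):

```sql
-- Remove staged files automatically once they have loaded successfully.
COPY INTO my_table
FROM @my_stage/data/
FILE_FORMAT = (TYPE = CSV)
PURGE = TRUE;

-- Or clean up afterwards; REMOVE accepts a path and an optional regex pattern.
REMOVE @my_stage/data/ PATTERN = '.*2023.*[.]csv';
```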
Question: 39
Snowflake does not provide which of the following SQL functions for retrieving information about tasks?
A. SYSTEM$CURRENT_USER_TASK_NAME
B. TASK_HISTORY
C. TASK_DEPENDENTS
D. TASK_QUERY_HISTORY
E. SYSTEM$TASK_DEPENDENTS_ENABLE
Answer: D
Explanation:
SYSTEM$CURRENT_USER_TASK_NAME
Returns the name of the task currently executing when invoked from the statement or stored procedure defined by the
task.
SYSTEM$TASK_DEPENDENTS_ENABLE
Recursively resumes all dependent tasks tied to a specified root task.
TASK_DEPENDENTS
This table function returns the list of child tasks for a given root task in a DAG of tasks.
TASK_HISTORY
This table function can be used to query the history of task usage within a specified date range.
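A sketch of querying task history (the task name is hypothetical):

```sql
-- Hypothetical example: the 10 most recent executions of task MY_TASK.
SELECT name, state, scheduled_time, completed_time, error_message
FROM TABLE(information_schema.task_history(
        task_name => 'MY_TASK',
        result_limit => 10));
```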
Question: 40
The SYSTEM$CLUSTERING_INFORMATION function returns clustering information, including average clustering depth, for a table based on one or more columns in the table. The function returns a JSON object containing average_overlaps name/value pairs.
Does a high average_overlaps value indicate well-organized clustering?
A. YES
B. NO
Answer: B
Explanation:
A higher average_overlaps value indicates more poorly organized clustering.
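A sketch of calling the function (the table and column list are hypothetical):

```sql
-- Returns a JSON object with average_overlaps, average_depth, and a partition-depth
-- histogram; lower values indicate better-organized clustering.
SELECT SYSTEM$CLUSTERING_INFORMATION('TPCH_CUSTOMERS', '(C1, C6)');
```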
Question: 41
The smaller the average depth, the better clustered the table is with regard to the specified column?
A. TRUE
B. FALSE
Answer: A
Question: 42
A data engineer ran the clustering depth analysis function below on the TPCH_CUSTOMERS table:
select system$clustering_depth('TPCH_CUSTOMERS', '(C1, C6)', 'C9 = 30');
What will this return?
A. An error: this function does not accept lists of columns as a third parameter.
B. An error: this function does not accept predicates ('C9 = 30') as a parameter.
C. The clustering depth for the table, calculated using the specified columns.
D. The clustering depth for the table, calculated using the clustering key defined for the table.
Answer: C
Explanation:
SYSTEM$CLUSTERING_DEPTH accepts an optional predicate as its third argument, so this call is valid: it calculates the clustering depth for the specified columns over the rows matching the predicate.
Question: 43
Mark the Correct Statements:
Statement 1. Snowflake's zero-copy cloning feature provides a convenient way to quickly take a snapshot of any table, schema, or database.
Statement 2. A data engineer can use the zero-copy cloning feature to create instant backups that do not incur any additional costs (until changes are made to the cloned object).
A. Statement 1
B. Statement 2
C. Both are False.
D. Statement 1 & 2 are correct.
Answer: D
Explanation:
Statement 1 is accurate. Snowflake's zero-copy cloning feature allows users to create a clone of a table, schema, or
database without copying the underlying data, making it an efficient method to produce a "snapshot" of the data.
Statement 2 is also accurate. Since zero-copy cloning doesn't duplicate the actual data until changes are made to the
clone, you don't incur additional storage costs immediately after cloning. Instead, you start incurring costs only when
there's a divergence between the clone and the original, due to Snowflake's unique data sharing and metadata tracking
capabilities.
Therefore, the correct answer is:
D. Statement 1 & 2 are correct.
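A sketch of zero-copy cloning (the object names are hypothetical):

```sql
-- Instant, zero-copy snapshot of a table before a risky change; no storage cost
-- is incurred until the clone and the original diverge.
CREATE TABLE orders_backup CLONE orders;

-- Schemas and entire databases can be cloned the same way.
CREATE SCHEMA analytics_backup CLONE analytics;
```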
Question: 44
Clones can be cloned, with no limitations on the number or iterations of clones that can be created (e.g. you can create a clone of a clone of a clone, and so on), which results in an n-level hierarchy of cloned objects, each with its own portion of shared and independent data storage?
A. TRUE
B. FALSE
Answer: A
$13$10

SnowFlake Advanced test - BingNews https://killexams.com/pass4sure/exam-detail/DEA-C01 Search results SnowFlake Advanced test - BingNews https://killexams.com/pass4sure/exam-detail/DEA-C01 https://killexams.com/exam_list/SnowFlake Snowflake Stock Fell Today -- Is It a Buy for 2024? No result found, try new keyword!Even with today's sell-off, Snowflake stock is still up roughly 32% over the last year. The company's share price has seen solid gains across the stretch, but it's also lagged far behind some other ... Tue, 02 Jan 2024 09:30:00 -0600 en-us text/html https://www.msn.com/ University of Utah tracks motion of falling snowflakes

04 Jan 2024

Laser light sheet and SLR camera scan the air to map lanes where snow is glistening.

A project at the University of Utah has studied the motion of snowflakes moving through turbulent air as they fall to the ground, a complex and dynamic environment.

As well as a better understanding of how snow cover gathers at ground level, the findings could lead to new data about snowflake fall speed, a parameter of interest for predicting weather patterns and assessing climate change.

"Even in the tropics, precipitation often starts its lifetime as snow," commented Timothy Garrett from Utah's Aerosol Cloud Climate Systems Group.

"How fast precipitation falls greatly affects storm lifetimes and trajectories and the extent of cloud cover that may amplify or diminish climate change. Just small tweaks in model representations of snowflake fall speed can have important impacts on both storm forecasting and how fast climate can be expected to warm for a given level of elevated greenhouse gas concentrations."

The research, published in Physics of Fluids, employed a novel experimental set-up designed to obtain both the vertical velocity and acceleration statistics of individual snowflakes settling in atmospheric surface-layer turbulence, parameters which have previously proven challenging to measure.

In 2021 the Utah group developed the differential emissivity imaging disdrometer (DEID), an instrument designed to measure the mass, size and density of falling snowflakes, or "hydrometeors." This device observes a flake falling onto a heated metal plate with an infrared camera, imaging the flake's spatial dimensions before it melts and then calculating its mass via the loss of heat from the hotplate.

The DEID is now commercialised by a Utah spin-out, Particle Flux Analytics, which also markets the multi-angle snowflake camera (MASC) taking 10 to 30 micron resolution photographs of hydrometeors from three angles; and SnowPixel, a thermodynamic sensor array designed to assess snowflake precipitation as part of meteorological sensor networks.

Particle tracking finds simplicity within complex patterns

The new study built on the previous use of DEID by positioning the instrument directly beneath a particle tracking system. This used a laser light sheet and a single-lens reflex camera to follow the path of flakes as they crossed the light sheet on their way down.

To test the system the project spent the winter of 2021-22 at Alta, the snowiest place in Utah, where nature delivered 900 inches of snow for it to study.

Despite the intricate shapes of snowflakes and the uneven air movements they encounter, the researchers found they could predict how snowflakes would accelerate based on the Stokes number, a flow parameter reflecting how quickly the particles respond to changes in the surrounding air movements.

The same mathematical pattern was also involved in how different snowflake shapes fall at different rates, suggesting a fundamental connection between the way the air moves and how snowflakes change as they descend from the clouds to the ground.

“Snowflakes are complicated and turbulence is irregular, so the simplicity of the problem is actually quite mysterious," said Garrett.

"There is something deeper going on in the atmosphere that leads to mathematical simplicity rather than the extraordinary complexity we would expect from looking at complicated snowflake structures swirling chaotically in turbulent air. We just have to look at it the right way, and our new instruments enable us to see that."

University of Utah video

Wed, 03 Jan 2024 19:42:00 -0600 text/html https://optics.org/news/15/1/5
Snowflake bucks downgrade even as analyst says AI talk is all talk (update) No result found, try new keyword!Snowflake (SNOW) fell on Thursday amid a downgrade from Monness, Crespi, Hardt, which called the stock overvalued and dismissed its AI talk. Read for more. Thu, 04 Jan 2024 04:06:38 -0600 en-us text/html https://www.msn.com/ This Advanced Gut Health Test Will Tailor a Specific Plan to Improve Your Gut Microbiome

As you’ve probably noticed, gut health is all the rage among health bloggers and reporters. Punch “improve gut health” into Google and you’ll get dozens of articles and blog posts with titles like “Improve Your Gut Health in 7 Easy Steps” and “15 Things You Should Eat for a Healthy Gut.” Unfortunately, while the science behind gut health is real, it’s unlikely that any of these tips and tricks and recommended foods will do anything for you, because just like snowflakes, no two microbiomes are exactly alike. So if you want a diet that promotes gut health, it has to be a diet designed specifically for your gut. And the only way to get that is with a cutting-edge gut health test like this advanced gut microbiome test from Viome.

The human microbiome is the complex ecosystem of microorganisms that live in and on the human body. Over the last 15 years, scientists have discovered that this ecosystem plays a huge role in our overall health. And of particular importance is the gut microbiome, which is the community of symbiotic bacteria and other microorganisms that live in your digestive tract.

When your gut microbiome is out of balance, your body doesn’t absorb nutrients the way it should. This results in inflammation, which scientists now realize is the root of almost every chronic disease. recent research has shown that gut microbiome can affect almost every system in the body. Some studies even link gut health to specific diseases and conditions, including diabetes, obesity, irritable bowel syndrome, and colon cancer.

Unfortunately, as with many other health trends, most of the actual science pertaining to the gut microbiome has been lost in a sea of so-called probiotics and gut health products that are only designed to cash in on our desire to be healthy. In reality, if you want to take advantage of the recent advances human microbiomics to lose weight and Improve your overall health, you need a microbiome test to figure out exactly what is going on in your digestive tract. And that brings us to Viome's advanced gut health test.

Viome

A microbiome test is sort of like a DNA test. Instead of mapping out genes that are linked to specific health issues, think of it as a gut health test that maps out the microorganisms in your gut microbiome. And Viome's Gut Intelligence Test is the most advanced microbiome test in the world. It uses advanced metatranscriptomic sequencing technology developed at the Los Alamos National Laboratory to identify and quantify the microorganisms in your gut. Then it analyzes what nutrients and toxins these organisms are producing in order to provide you with personalized nutrition recommendations.

Once you place your order, Viome will send you an easy-to-use at-home kit to collect your sample. After you return the kit by mail, Viome analyzes it with its proprietary microbe identification technology. They then use an advanced artificial intelligence algorithm to create customized dietary recommendations.

These recommendations are designed to increase microbial species associated with overall wellness; minimize microbial species associated with poor health; create the ideal ratio of proteins, carbohydrates, and fats for your diet; and encourage foods that are most compatible with your metabolism. And all recommendations come with detailed explanations and are delivered straight to your mobile device via the intuitive Viome app.

So if you’re ready to get serious about gut health, don’t waste your time or money on gimmicky supplements and beverages. Trust science and get your microbiome tested with a gut health test from Viome.

Futurism fans: To create this content, a non-editorial team worked with an affiliate partner. We may collect a small commission on items purchased through this page. This post does not necessarily reflect the views or the endorsement of the Futurism.com editorial staff.


Source: https://futurism.com/neoscope/viome-gut-health-test-microbiome (Tue, 19 Dec 2023)
Better Cloud Stock: Snowflake vs. Oracle

Snowflake (NYSE: SNOW) and Oracle (NYSE: ORCL) represent two different ways to invest in the growing cloud market. Snowflake helps companies clean up and aggregate large amounts of data for third ...

Source: https://www.nasdaq.com/articles/better-cloud-stock:-snowflake-vs.-oracle (Fri, 15 Dec 2023)

Snowflake Stock Beats Investor Expectations On Growth From Generative AI

Snowflake — a mobilizer of enterprise data through a company’s data cloud — exceeded investor expectations in its most recent financial report, according to MarketWatch.

While Snowflake’s shares closed December 5 at $186 a share — 52% below their November 2021 high — the company’s November 29 third quarter report seems to have boosted its stock price by 8%.

Will Snowflake continue to beat expectations? Based on my recent interviews with company executives, a Goldman Sachs analyst, and a Snowflake customer — Freddie Mac’s chief data officer — the data platform provider has several factors working in its favor, including:

  • More spending on Snowflake’s services.
  • High return on customer investment.
  • New services to satisfy new customer needs.

Snowflake faces several risks — including competition from hyperscalers like AWS and others, the possible failure of its compliance procedures, and slower than expected migration to the cloud, according to Morningstar.

However, the company appears to be riding a cresting wave of demand and is well-positioned to capture a significant share of it.

Snowflake’s Third Quarter 2023 Performance and Prospects

Snowflake stock rose 7.6% after the company reported expectations-beating results on November 29.

Here are the key numbers from MarketWatch:

  • Q3 product revenue: $698 million, up 34% and $19 million above analyst estimates.
  • Q3 Adjusted earnings per share: 25 cents — up 127% and nine cents per share above estimates.
  • Q3 net revenue retention rate: 135%.
  • Q3 customers with over $1 million in revenue: 436.
  • Q4 revenue guidance: a range between $716 million and $721 million — the midpoint of which is about $22.5 million more than estimates.

“These results reflect strong execution in a broadly stabilizing macro environment,” CEO Frank Slootman said in a release.

Snowflake enjoyed increased demand in the third quarter. In a December 4 interview, Slootman told me, “We grew revenue 34% in the latest quarter. The big takeaway to read into it is the markets have stabilized.”

He added that in “Q4 22, Q1 23, and Q2 23 we saw disruption as people focused on rationalization and getting rid of stuff. We reset guidance based on our consumption model. The financial world does not love the consumption model on the way down. In Q3 we maintained our guidance. We see continued strength and raised our Q4 guidance.”

More Spending On Snowflake’s Services

Due to Snowflake’s consumption model — where customers pay for what they use, rather than a fixed per person monthly fee — the company’s revenues fluctuate with demand.

Despite pushback from investors, Snowflake sees the consumption model as the wave of the future. “The consumption model is positive for the entire sector,” Slootman said.

“I was previously CEO of ServiceNow (NOW), which has a traditional subscription model,” he added. “I thought the model was unfair to customers to charge them on a per user per month basis whether they used it or not.”

Investors like the predictability of the traditional SaaS model but customers have mixed feelings about it. As he told me, “The consumption model is great for customers on the way down — since they do not pay for what they do not use. However, on the way up, they pay more to use more and they don’t like it.”
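Slootman's point about the two pricing models can be made concrete with a toy calculation. All figures below are hypothetical, purely for illustration:

```python
# Toy comparison of subscription vs. consumption pricing.
# All figures below are hypothetical, purely for illustration.

def subscription_cost(seats, price_per_seat_month, months=12):
    """Fixed per-user fee, charged whether or not the product is used."""
    return seats * price_per_seat_month * months

def consumption_cost(credits_used, price_per_credit):
    """Pay only for what is actually consumed (e.g., compute credits)."""
    return credits_used * price_per_credit

flat = subscription_cost(seats=200, price_per_seat_month=50)  # 120,000/yr

# Light usage costs less under consumption pricing
# ("great for customers on the way down").
light = consumption_cost(credits_used=1_000, price_per_credit=3.0)
assert light < flat

# Heavy usage can overtake the flat fee ("they pay more to use more").
heavy = consumption_cost(credits_used=50_000, price_per_credit=3.0)
assert heavy > flat
```

The sketch also shows why revenue is less predictable under consumption pricing: it moves with usage, which is exactly the fluctuation investors dislike on the way down.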

Earlier in 2023, customers were consuming less. Now they are increasing their spending. “We came out of a period of irrational exuberance during the pandemic. There was over caffeinated spending,” he said. “The bill came due and companies rationalized and optimized in reaction to the prior several years.”

By diversifying its customer base, Snowflake is less likely to experience such future fluctuations. “Over the last four or five years, our business was mostly digital natives,” Slootman told me.

He added, “Since then we have layered in traditional companies — such as banks, manufacturers, health care, and retailers. We are more stable now. It won’t happen again.”

High Return On Customer Investment

Snowflake customers say they get value from the company’s products. I have previously written about how Goldman Sachs and State Street (STT) have benefited. In a nutshell, they value getting faster access to the data they need to make better decisions.

Freddie Mac is getting faster answers with Snowflake. On December 5, Aravind Jagannathan, vice president and chief data officer at the company, told me, “In 2019 we wanted to move our data from on-premises to the cloud. We wanted to be quicker in providing insights from the data while reducing the risk of having multiple copies of our data moving back and forth.”

Freddie Mac picked Snowflake for its faster speed to market and better query performance. “A report that used to take until 3 pm to deliver arrived at 8 am,” Jagannathan said. He added, “Processing that used to take 12 hours is now done in 35 minutes. A capital report that used to take many hours can be done in 10 minutes.”

Freddie Mac will take its time on using Generative AI. “We are being thoughtful about regulators and senior management,” he said. “Snowflake can support Generative AI. We are going to make sure we have the right data governance and have identified the right use cases.”

New Services To Satisfy New Customer Needs

The world changes fast and companies can only keep growing if they position themselves to take advantage of those changes in ways that benefit customers.

Snowflake’s technology has evolved in the direction of greater flexibility for customers. “We used to offer data warehousing for a very specific use case of making batch processing data available for analytics,” Slootman said.

He continued, “We have added many different kinds of workloads. Now we have a platform based on all kinds of live data. Now the work is going to the data. It is easy to inquire about what is happening now in the business. We are working towards enabling users to ask ‘What will happen next quarter? Why? Will it happen again? What should we do about it?’”

Generative AI will contribute to realizing Slootman’s vision for a forward-looking decision support system. According to my December 4 interview with Sridhar Ramaswamy, Snowflake’s senior vice president of AI, “Fidelity uses Snowflake as its platform of record for all data. If they want a global picture it can be done easily with live data through Snowflake.” (Ramaswamy previously led Neeva, an ad-free, AI-powered search service that Snowflake acquired in May 2023.)

To that end, it sounds to me as though Snowflake is cautiously building a way for end-users to make natural language queries of their data. “2023 is the year of AI,” Ramaswamy said. “First we have the ability to synthesize information in a fluid conversation through ChatGPT using Neeva.”

Snowflake is developing this vision in stages. First, search can be done through frequently asked questions. Next, the company is offering CoPilot to make it easier for people to write SQL code to provide faster access to the data. “We are building out the capability to query data in real time,” he concluded.

Like Freddie Mac, Snowflake is not going to offer Generative AI until it is safe. “We are building business applications and we want it to be rock solid with no hallucinations,” Slootman said.

He added, “AI is bridging the gap between man and machine. We are working on the last mile. We will not launch until security and governance are solid and we provide links to know where the data came from. We will have auditability and verifiability.”

Where Will Snowflake Stock Go?

If Generative AI works safely and effectively, more people will use Snowflake to access corporate data — which would increase the company’s revenue and presumably its stock price.

How much revenue will this add? “It is not there yet to quantify. It is bigger than a breadbox. It will become massive. It will pay for itself — for example in contact centers. It can run multiple workloads at the same time. A client told me, ‘People are getting drunk on Snowflake,’ ” Slootman concluded.

Goldman Sachs is bullish on Snowflake. According to my November 30 interview with Goldman Sachs managing director, Kash Rangan, “Snowflake is like a Ferrari — it is bigger, faster, and has more power. These days we are talking about analyzing so much more data — exabytes. Creating an architecture that runs in the cloud is essential for providing accurate analysis at scale.”

Snowflake creates significant business value for customers. “Snowflake can help you compare three suppliers and identify which one delivers on time and at the lowest price,” he told me.

Rangan added, “An investment bank can run daily analysis to find out which customer is the most profitable, which equity and fixed income trades made the most money. It is really hard to get those answers fast. If you have to wait a week, the information is not actionable.”

Goldman sees Generative AI as an opportunity for Snowflake in two areas: data and training large language models. Snowflake enables non-technical users to ask a question in plain English to get access to all that data.

For example, Rangan said, “You can ask: ‘My supply chain is gnarled up. How can I configure it so I can deliver what customers ordered in three days rather than seven?’ ” He added, “Snowflake uses SQL. So before LLMs, only experts in SQL coding could tap into that valuable business data. Now anyone can access it.”
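The supplier comparison Rangan describes is a classic SQL aggregation. Here is a minimal, self-contained sketch using Python's built-in sqlite3; the table and data are invented for illustration, and a Snowflake query would look much the same for this kind of analysis:

```python
import sqlite3

# Hypothetical supplier delivery data, invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE deliveries (
    supplier TEXT, unit_price REAL, days_late INTEGER)""")
con.executemany(
    "INSERT INTO deliveries VALUES (?, ?, ?)",
    [("Acme", 9.50, 0), ("Acme", 9.40, 1),
     ("Globex", 8.90, 4), ("Globex", 9.10, 5),
     ("Initech", 9.00, 0), ("Initech", 8.90, 0)],
)

# The SQL an analyst would write: which supplier delivers on time
# at the lowest average price?
rows = con.execute("""
    SELECT supplier,
           ROUND(AVG(unit_price), 2) AS avg_price,
           AVG(days_late)            AS avg_days_late
    FROM deliveries
    GROUP BY supplier
    ORDER BY avg_days_late, avg_price
""").fetchall()

print(rows[0])  # best supplier first
```

This is the kind of query that, per Rangan, previously required a SQL expert and that natural-language interfaces aim to generate automatically.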

He thinks Snowflake has a bright future. “It has core technology, a great business model, an outstanding management team, and the ability to execute.”

Another analyst has a more mixed outlook. According to Morningstar — which set a price target of $231 for Snowflake — the analyst was disappointed with the company’s profitability, even though its Q3 revenue and revenue forecast were better than anticipated.

Ultimately, Morningstar views Snowflake as well positioned to take advantage of the growth resulting from AI. However, the analyst is anticipating the investment required to satisfy the demand will cut into Snowflake’s profitability.

As long as Snowflake keeps reporting better than expected revenue growth, its stock price should rise.

Source: https://www.forbes.com/sites/petercohan/2023/12/06/snowflake-stock-rises-on-growth-from-generative-ai/ (Peter Cohan, Wed, 06 Dec 2023)
Snowflake Inc Ordinary Shares - Class A SNOW

Founded in 2012, Snowflake is a data lake, warehousing, and sharing company that came public in 2020. To date, the company has over 3,000 customers, including nearly 30% of the Fortune 500. Snowflake’s data lake stores unstructured and semistructured data that can then be used in analytics to create insights stored in its data warehouse. Snowflake’s data sharing capability allows enterprises to buy and ingest data almost instantaneously, compared with a traditionally months-long process. Overall, the company is known for the fact that all of its data solutions can be hosted on various public clouds.

Source: https://www.morningstar.com/stocks/xnys/snow/quote (Wed, 27 Dec 2023)
Snowflake Ranked #1 on 2023 Fortune Future 50 List

Annual Fortune list recognizes Snowflake as the leading company with the capability to deliver sustained long-term growth in the era of generative AI

Snowflake (NYSE: SNOW), the Data Cloud company, today announced that it was awarded the number one spot on the 2023 Fortune Future 50 list. The annual list from Fortune recognizes leading publicly traded companies based on a market-based assessment of a company’s growth potential, and its capacity to deliver against that potential. Among the focuses of this year’s list includes companies driving the future with generative AI.

Snowflake has transformed the data industry with Snowflake Data Cloud, a global network that connects organizations to the data and applications most critical to their business. The Data Cloud enables a wide range of possibilities, from creating a real-time data backplane for the modern enterprise, collaborating over content with partners and customers, integrating external data and applications for fresh insights, to simplifying the complexity of generative AI - all from Snowflake’s single, cross-cloud platform. This accolade from Fortune is a testament to the incredible opportunity ahead as Snowflake empowers customers across industries to more easily mobilize data and AI for business value.

“In this new AI era, generative AI and large language models will reshape how we live, work and do business. But there is no AI strategy without a data strategy,” said Sridhar Ramaswamy, SVP of AI at Snowflake. “We are honored to be named at the top of Fortune’s Future 50 list and look forward to continuing to help customers solve their biggest problems and deliver real value using AI.”

To identify the Future 50, the BCG Henderson Institute examined more than 1,000 publicly traded companies with at least $20 billion in market value or $10 billion in revenue in the 12 months through the end of 2022. Thirty percent of a company’s score is based on market potential—defined as its expected future growth as determined by financial markets. The remaining 70% is based on a company’s capacity to deliver against this potential, which comprises 19 factors, selected for their ability to predict growth over the following five years, which fall into four categories: strategy, technology and investments, people, and structure.
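The weighting described above (30% market potential, 70% capacity to deliver, with capacity spread across 19 factors) reduces to a simple weighted average. A sketch with hypothetical scores on a 0–100 scale:

```python
# Sketch of the Future 50 weighting: 30% market potential,
# 70% capacity to deliver (averaged over 19 factors).
# All input scores are hypothetical, for illustration only.

def future50_score(market_potential, capacity_factors):
    capacity = sum(capacity_factors) / len(capacity_factors)
    return 0.3 * market_potential + 0.7 * capacity

# A company scoring 90 on market potential and 80 on every
# capacity factor lands at 0.3*90 + 0.7*80 = 83.
score = future50_score(90, [80] * 19)
print(round(score, 1))
```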

Learn More

  • Join Snowflake’s annual developer conference, BUILD, December 5-6. Register for the virtual event here.
  • Read how Snowflake Cortex is providing customers with fast, easy, and secure LLM-powered app development in this blog post.
  • Stay on top of the latest news and announcements from Snowflake on LinkedIn and Twitter.

About Snowflake

Snowflake enables every organization to mobilize their data with Snowflake’s Data Cloud. Customers use the Data Cloud to unite siloed data, discover and securely share data, power data applications, and execute diverse AI/ML and analytic workloads. Wherever data or users live, Snowflake delivers a single data experience that spans multiple clouds and geographies. Thousands of customers across many industries, including 647 of the 2023 Forbes Global 2000 (G2K) as of October 31, 2023, use Snowflake Data Cloud to power their businesses. Learn more at snowflake.com.

Media Contact
Danica Stanczak
Global Corporate Communications Lead, Snowflake
press@snowflake.com

View source version on businesswire.com: https://www.businesswire.com/news/home/20231205878318/en/

Source: https://www.morningstar.com/news/business-wire/20231205878318/snowflake-ranked-1-on-2023-fortune-future-50-list (Tue, 05 Dec 2023)
What snowflakes tell us about our Universe

PROF BRIAN COX: Snowflakes are intricate, beautiful, mysterious, and totally captivating. But for all their complexity - and endless variety - the structure of a snowflake can be explained by a few universal laws of nature. Laws that explain everything from snowflakes to galaxies.

Let’s start at the beginning. What is a snowflake? Or, to use its more technical name, a snow crystal? A snow crystal forms up in the clouds when water vapour meets little specks of dust or pollen. This forms its tiny hexagonal heart. The tips stick out and are rough. This attracts water molecules. And then more water molecules. And more. These form the branches of our snowflake. The size and shape of these branches depends on the exact temperature and humidity that the snowflake meets on its journey through the clouds, pulled down by the force of gravity. Each one takes a very slightly different route - meaning no two snowflakes are quite the same. When a snowflake lands on your sleeve, it has been on its own, totally unique, journey to reach you. Before melting away in a moment.

Way back in 1611, on a bitterly cold January morning in Prague, a snowflake landed on the sleeve of mathematician Johannes Kepler. And it got him thinking: “Why do snowflakes have six sides?” Kepler’s breakthrough was his theory that this hexagonal pattern is the most efficient use of space. Whether it’s a honeycomb within a beehive. Or piles of stacked cannonballs. Or a delicate, transient, snowflake. It took 400 years - 400 years - for his theory to be proven. What Kepler didn’t know at the time is that each molecule of water, or H2O, is made up of two hydrogen atoms and one oxygen atom. As the water molecules cluster together when they freeze, the angle between the hydrogen atoms is always, approximately, 105 degrees. And that gives us the six sides. At its heart, a snowflake is always a hexagon.

But it can grow into all sorts of weird and wonderful shapes. Long and thin, like a pencil. Sharp like a needle. Cylindrical like a bullet. Or, just occasionally, triangular. The truth is though, most snowflakes are kind of… well, blob-like.

If you speak to a snowflake photographer - there are just a handful in the world - they’ll tell you it takes days and days out in the cold to get that “money shot”. And the conditions have to be just right – between minus 15 and minus 13 degrees. But ever since Wilson Bentley, a farmer from the US state of Vermont, painstakingly took the first photos of stunning snowflakes in 1885, we’ve been hooked.

Scientists have shown that symmetry is incredibly pleasing to the human brain. Snowflakes are all radially symmetrical, which means you can cut them into identical slices, like a cake. Shells, flowers, starfish, even spiral galaxies, like the Milky Way, share this type of symmetry.

And nature has one last trick up its sleeve. Snowflakes aren’t actually white. They’re clear, but they have lots of edges, and this scatters the light, making them appear white.

Each snowflake is a microcosm of the laws of physics. Gravity makes it fall. Electromagnetism dictates its shape. And you’ve got symmetry. It’s the same with the stars, and solar systems, and planets. And with us. When you look at a snowflake, you can read its history. Its own unique story. The experiences it encounters shape it into what it is. Just like us, really.
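Kepler's claim that the hexagonal arrangement is the most efficient use of space can be checked numerically: circles packed on a hexagonal lattice cover a fraction π/(2√3) ≈ 90.7% of the plane, versus π/4 ≈ 78.5% for a square grid. A quick sketch:

```python
import math

# Fraction of the plane covered by unit circles on each lattice.
square_packing = math.pi / 4                   # one circle per 2x2 square cell
hex_packing = math.pi / (2 * math.sqrt(3))     # one circle per hexagonal cell

print(f"square:    {square_packing:.4f}")      # ~0.7854
print(f"hexagonal: {hex_packing:.4f}")         # ~0.9069
assert hex_packing > square_packing
```

The hexagonal value is the densest possible circle packing in the plane, which is the sense in which the transcript's 400-year wait for a proof should be read.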


Source: https://www.bbc.co.uk/ideas/videos/what-snowflakes-tell-us-about-our-universe/p0h19v6r (Fri, 22 Dec 2023)
AWS And Snowflake: ‘From True Competitors, To Frenemies To…An Alliance’

‘In the old days, I think we used to sit and fester on it and sort of ‘poor Snowflake,’ and it was sort of the David and Goliath story,’ says Colleen Kapase, Snowflake’s senior vice president of worldwide partner and alliances. ‘We’ve reached across the trenches, (and) we’ve created those relationships.’

Amazon Web Services’ relationship with data cloud provider Snowflake has followed a dynamic arc that’s seen the two companies evolve from competitors to frenemies to more strategic allies serving joint customers amid a spirit of coopetition.

The success of that partnership is illustrated by their joint co-selling goals that have more than doubled year over year since 2020, according to both companies.

“We continue to blow our metrics out of the water,” Sabina Joseph, AWS’ general manager of technology partners, told CRN.

The seven-year relationship hasn’t always been a cakewalk for AWS, Snowflake or their joint partners. Snowflake started as an AWS customer and competitor. While today it also runs on Microsoft Azure and Google Cloud Platform, Snowflake was born on AWS and a “substantial majority” of its business operates on AWS’ public cloud infrastructure, according to regulatory filings. In mid-2020, prior to its $3.36 billion initial public offering, Snowflake disclosed it had committed to spending $1.2 billion through July 2025 on AWS cloud infrastructure services. As a competitor, meanwhile, Snowflake competes directly against Amazon Redshift, AWS’ data warehouse service that launched in 2013.

Deloitte Consulting had been a longstanding global systems integrator (GSI) partner with AWS before becoming a partner with Snowflake in early 2020.

“It’s always tricky when you’ve got an established, very strategic relationship and then a new player comes in,” said Frank Farrall, the cloud analytics and AI ecosystems leader at Deloitte who also heads its Snowflake alliance. “We knew…they already sit on top of AWS in the majority of situations, however, they compete with one of AWS products, so there is this coopetition-type thing that we’re really going to have to navigate. We were actually really worried. We took a…kind of risk mitigation orientation…early in the relationship, and there have been times of tension between AWS and Snowflake that we have had to navigate a little bit and try to read the tea leaves around how that’s playing out.”

Back in 2015, AWS and Snowflake’s relationship was overwhelmingly competitive.

“The two sales teams in the field would do just that: They would compete,” said Colleen Kapase, Snowflake’s senior vice president of worldwide partner and alliances. “If Snowflake did win, sort of we were tolerated. That’s completely evolved and changed over time, and I do credit AWS’ customer-first mentality for helping to make this space for that change.”

AWS and Snowflake had about 30 joint customer deals prior to 2019. Their relationship began to shift when they started building out a joint strategy together and focused on joint customer use cases, particularly around migrating large on-premises customers and their data-oriented workloads onto AWS using Snowflake, according to Joseph.

“We started to realize this is not about perceived coopetition that exists, this is really about providing differentiated, best-of-a-kind solutions for our mutual customers,” Joseph said.

Snowflake deeply invested in technical integrations with AWS that were core to a successful go-to-market partnership, Joseph said, including integrations with AWS Lambda, a serverless, event-driven compute service, and AWS PrivateLink, which allows customers to privately access AWS services without using public IPs.

“Those product integrations were key,” Kapase said. “PrivateLink was key to Snowflake selling into the enterprise segment — that was required for a lot of our enterprise customers.”

Snowflake now has more than 20 horizontal product integrations with AWS – the most among its cloud provider partners. Snowflake was a featured partner for AWS AI for data analytics (AIDA) solutions unveiled at AWS re:Invent 2021 last November. It has deepened its integrations with Amazon SageMaker, AWS’ flagship machine learning (ML) service, and was the only partner included in last November’s launch of SageMaker Canvas, a visual, no-code, ML capability that allows business analysts to generate predictions without ML expertise.

The Evolution Of The Relationship

While AWS and Snowflake’s partnership started on the product side, it progressed to sales compensation and then to specific use cases in vertical industries. Beyond the breadth and depth of its AWS integrations, Snowflake has become one of AWS’ largest partners in terms of its go-to-market engagement and number of joint customers — from healthcare company Anthem to global shipping and mailing company Pitney Bowes to financial services provider Western Union.

AWS and Snowflake signed a strategic collaboration agreement in mid-2020, with both companies increasing their investments in partner sales, marketing and alliance teams globally.

“We also have an annual partner plan from a technical and business perspective, which guides us to the set of initiatives…that we’re going to do in a (geographic region), how are we going to work with SIs, GSIs, what kind of industry vertical use cases we want to focus on, and then also what kind of goals — actual hard goals — that we jointly want to achieve,” Joseph said.

In December 2020, AWS unveiled an incentive program called AWS ISV Accelerate that compensates its sales teams for selling independent software vendor partners’ SaaS solutions — such as the Snowflake Data Cloud — that are integrated with AWS. That helped align AWS’ and Snowflakes’ sales teams.

“The change in sales comp went from we were tolerated, to all of a sudden the sales team on the AWS side was saying, ‘OK, I see this unique use case for Snowflake,’” Kapase said. “We started seeing that the co-sell between the two companies — the joint registrations of our field bringing AWS in, AWS bringing us in — just ignited. Sabina and I had to sit down in Q4 and say, ‘We’re going to blow out our co-sell goal together. We have to up our goal for next year,’ because it was like wildfire. It wasn’t just in the U.S. We made some bets in Europe and in Asia as well, and we still had a 2X increase there in terms of customer engagements and customer wins.”

AWS and Snowflake last year also began focusing their partnership on industry verticals such as media and advertising and financial services, and doubling down on working with GSIs and other ISV partners.

The two companies have gone through the “curve of true competitors, to frenemies to truly being an alliance,” Kapase said.

“We have a similar DNA between us and AWS,” she said. “Snowflake is 100 percent consumption-driven. That’s the only numbers we report out to the street. A lot of that customer-first mentality is not just about buying the products, it’s about using the products. And what we saw as we’ve been working together is that consumption moves faster…and we have happier customers jointly across each other.”

That’s not to say everything has run smoothly.

“Like with any field organization, there are always challenges,” Kapase said. “In the old days, I think we used to sit and fester on it and sort of ‘poor Snowflake,’ and it was sort of the David and Goliath story. We’ve reached across the trenches, (and) we’ve created those relationships. Myself and quite a few of the executives now at Snowflake have the bat phone directly into Sabina, to Carol Potts, who runs the (AWS) ISV sales team. We had to bring those issues out, and what we saw was a really fast execution to resolve those field issues. You’d like to say it’s more complicated than that, but the reality is old-fashioned communication and sharing information and working through issues helps strengthen the relationship.”

It comes down to transparency, according to Joseph.

“I’m a firm believer that in any partnership, you’ve got to be transparent and say it the way it is and state the facts, and then that would bubble up to the right outcome for the end customer,” she said. “That’s how I believe that we have operated, especially the last two to three years, which has enabled us to really deepen this partnership.”

How AWS And Snowflake Partners View The Relationship

AWS and Snowflake are both “best of breed” and better together, said Hilary Feier, general manager for global data and analytics at Seattle consulting firm Slalom. AWS’ 2021 National System Integrator Partner of the Year and Snowflake’s four-time Partner of the Year, Slalom has implemented the AWS/Snowflake integration for its own internal data platform.

“They’re just a winning combination,” Feier said. “Snowflake brings a bit of that ‘easy button’ to cloud data platforms where you can T-shirt-size it. You can basically have people with SQL skills take what they brought with them. And AWS has this incredibly extensive, comprehensive, kind of ‘Legoland’ of incredible parts that you can really build whatever you want. That combination together works really, really well to service our clients.”

It’s been about 2.5 years since Slalom experienced occasional pressure to favor Amazon Redshift over Snowflake from AWS services teams incentivized to sell the cloud provider’s own product, according to Feier.

“We have some really direct and frankly good partnering conversations on…the strengths of the different platforms, and they were really open,” she said. “It was…very productive, because we have a really good relationship on both sides.”

AWS eventually realized that it benefitted significantly when Snowflake won on its platform, Deloitte’s Farrall said. If a client picks Snowflake and has AWS as the underlying hyper-scaler provider, AWS gets the consumption benefit of Snowflake and the data consumption that’s driven by the customer, he said.

“What we have seen over the last probably 18 months is almost like a radical reorientation of AWS’ thinking related to us and Snowflake,” Farrall said. “We now have senior account executives from AWS that will call us up and say, ‘Hey, we think that the right answer for this particular client that we’re all looking at…is Snowflake, AWS and Deloitte. Can we partner together? Can we work together on the architecture? Can we build a business case together for that client?’ At first, we were really quite stunned when we started to get this type of a reach-out.”

That outreach has become common, particularly in the last six months to a year, according to Farrall, and Deloitte, AWS and Snowflake now have a joint go-to-market motion with a dozen prioritized customers where that architecture exists.

“We’re talking together about how do we best maximize the result for the customer,” Farrall said. “You hear the word ecosystem thrown out there quite a lot. What we’re seeing with the dynamic with AWS and Snowflake is a real ecosystem play, and we’re very positive, very excited about that.”

Source: https://www.crn.com/news/cloud/aws-and-snowflake-from-true-competitors-to-frenemies-to-an-alliance (Tue, 22 Feb 2022)



