Snowflake has thousands of enterprise customers who use the company's data and AI technologies. While many issues with generative AI have been solved, significant room for improvement remains.
Two such issues are text-to-SQL generation and AI inference. SQL is the query language used for databases, and it has been around in various forms for over 50 years. Existing large language models (LLMs) have text-to-SQL capabilities that can help users write SQL queries, and vendors including Google have introduced advanced natural language SQL capabilities. Inference is also a mature capability, with common technologies such as Nvidia's TensorRT widely deployed.
While enterprises have widely deployed both technologies, they still face unresolved issues that demand solutions. Existing text-to-SQL capabilities in LLMs can generate plausible-looking queries, but those queries often break when executed against real enterprise databases. When it comes to inference, speed and cost efficiency remain areas where every enterprise is looking to improve.
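The failure mode is easy to reproduce. Below is a minimal, hypothetical sketch (not Snowflake's implementation) using an in-memory SQLite database: an LLM-style query references a column that sounds right but does not exist in the actual schema, so it looks plausible yet fails the moment it is executed. The table name, columns, and queries here are invented for illustration.

```python
import sqlite3

# Hypothetical mini-schema standing in for a real enterprise database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 'EMEA', 100.0), (2, 'APAC', 250.0)")

def runs_cleanly(query: str) -> bool:
    """Return True only if the query executes against the live schema."""
    try:
        conn.execute(query)
        return True
    except sqlite3.OperationalError:
        # e.g. "no such column" -- the query parsed fine but the
        # referenced schema element does not exist.
        return False

# A plausible-looking generated query: it hallucinates a 'revenue'
# column that the real table does not have.
llm_query = "SELECT region, SUM(revenue) FROM orders GROUP BY region"
print(runs_cleanly(llm_query))   # False

# Grounded in the actual schema, the same intent executes fine.
corrected = "SELECT region, SUM(amount) FROM orders GROUP BY region"
print(runs_cleanly(corrected))   # True
```

Validating generated SQL against the live schema before returning it to the user, as this check does, is one common guardrail for exactly the breakage described above.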






