Those of you who read my blog are probably used to the “Interesting Stuff” posts being published on a Sunday. This week, however, I am a bit late: I have been travelling and have not had time to write the post until now.
Anyway, the “stuff” I cover this week includes Microsoft Fabric and some cool AI/ML and LLM (Large Language Model) things.
Finally, there is a very interesting article on the real-time streaming ecosystem by Hubert Dulay.
Microsoft Fabric
- Microsoft Fabric introduction video. As the title says, this is an introductory video, and it gives a good overview of the concepts behind Microsoft Fabric.
AI/ML
- How to Predict Player Churn, with Some Help From ChatGPT. This blog post takes a low-code approach to analyzing and visualizing player churn in the gaming industry. It explains why understanding player churn matters, gives an overview of the dataset used in the analysis, and then uses a low-code machine learning platform to train a model that predicts whether a player will stop playing a game. It also covers interpreting the results and techniques for improving the model’s performance.
- Introducing Azure OpenAI Service On Your Data in Public Preview. This is HUUUGE! The post announces the ability to run the OpenAI models directly on your own data, without having to train or fine-tune the models. As I said, this is huge, and I am looking forward to seeing what people will do with it. To give a feel for what it looks like, I have put a rough sketch of calling the preview API right after this list.
- All You Need to Know to Build Your First LLM App. This post walks you through developing an LLM (Large Language Model) app: the fundamental concepts behind language models and their applications, why pre-trained models matter, the Python libraries and frameworks you can use, and the steps involved in putting an app together. For a taste of the basic pattern, see the second sketch after this list.
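
To give a feel for the “On Your Data” announcement above, here is a minimal sketch of calling the public preview REST API with an Azure Cognitive Search index as the data source. The endpoint shape, api-version and the dataSources payload reflect my reading of the preview documentation, and the resource names below (my-openai-resource, my-search-index, and so on) are placeholders, so treat this as a sketch rather than copy-paste code.

```python
import os
import requests

# Placeholders - replace with your own Azure OpenAI and Cognitive Search resources.
openai_endpoint = "https://my-openai-resource.openai.azure.com"
deployment = "my-gpt-35-turbo-deployment"
search_endpoint = "https://my-search-resource.search.windows.net"
search_index = "my-search-index"

url = (
    f"{openai_endpoint}/openai/deployments/{deployment}"
    "/extensions/chat/completions?api-version=2023-06-01-preview"
)

payload = {
    # The dataSources block is what tells the service to ground the model
    # on your own data instead of answering from the base model alone.
    "dataSources": [
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": search_endpoint,
                "key": os.environ["SEARCH_KEY"],
                "indexName": search_index,
            },
        }
    ],
    "messages": [
        {"role": "user", "content": "What does our travel policy say about rental cars?"}
    ],
}

headers = {
    "api-key": os.environ["AZURE_OPENAI_KEY"],
    "Content-Type": "application/json",
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```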
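
And here is the second sketch, a stripped-down version of the pattern a first LLM app typically follows: embed a handful of document chunks, pick the most relevant one for a question, and let the model answer using that chunk as context. I am using the pre-1.0 openai Python package here, and the “knowledge base” is made up for illustration; the article itself goes much deeper into libraries, vector stores, and so on.

```python
import numpy as np
import openai

openai.api_key = "sk-..."  # placeholder

# A toy "knowledge base" - in a real app these would be chunks of your documents.
chunks = [
    "Our API rate limit is 100 requests per minute per key.",
    "Support is available weekdays between 09:00 and 17:00 CET.",
    "Invoices are sent on the first business day of each month.",
]

def embed(texts):
    # text-embedding-ada-002 is the usual choice for embeddings at the time of writing.
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([d["embedding"] for d in resp["data"]])

chunk_vectors = embed(chunks)

question = "When can I reach support?"
q_vector = embed([question])[0]

# Cosine similarity to find the chunk most relevant to the question.
scores = chunk_vectors @ q_vector / (
    np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q_vector)
)
context = chunks[int(np.argmax(scores))]

# Ask the chat model, grounding it on the retrieved chunk.
answer = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": f"Answer using only this context: {context}"},
        {"role": "user", "content": question},
    ],
)
print(answer["choices"][0]["message"]["content"])
```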
Streaming
- Real-Time Streaming Ecosystem Part 5. This blog post is part of Hubert’s multi-part series on the real-time streaming ecosystem. In this post, he covers how to serve analytical data to end users, focusing on RTOLAPs (Real-Time Online Analytical Processing databases), the last mile needed to get real-time data to its consumers: they provide fast, low-latency querying for decision-making and business intelligence. To make the idea a bit more concrete, there is a small sketch of the kind of query this enables right after this bullet.
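
Here is that sketch: the kind of low-latency aggregation a dashboard might fire at a real-time OLAP store every few seconds. The HTTP endpoint, table and column names are all made up for illustration (and the exact SQL time functions differ between engines); the point is simply that consumers get fresh, aggregated answers in milliseconds rather than waiting for a batch job.

```python
import requests

# Hypothetical SQL-over-HTTP query endpoint of a real-time OLAP store.
QUERY_URL = "http://rtolap-broker:8099/query/sql"

# Aggregate the last five minutes of player events per game title.
sql = """
    SELECT game_title,
           COUNT(*)                AS events_last_5_min,
           COUNT(DISTINCT user_id) AS active_players
    FROM player_events
    WHERE event_time > CURRENT_TIMESTAMP - INTERVAL '5' MINUTE
    GROUP BY game_title
    ORDER BY active_players DESC
    LIMIT 10
"""

# A dashboard would run a query like this on a short interval and render the result.
response = requests.post(QUERY_URL, json={"sql": sql}, timeout=5)
response.raise_for_status()
print(response.json())
```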
~ Finally
That’s all for this week. I hope you enjoyed what I put together. Please comment on this post or ping me if you have ideas for what to cover.