Wednesday, 31 January 2024

Generative AI framework for Government

The UK government has released version 1.0 of its Generative AI Framework, created by the Central Digital and Data Office. This is public sector guidance with a focus on Large Language Models (LLMs).

The framework outlines ten principles:

Principle 1: You know what generative AI is and what its limitations are

Principle 2: You use generative AI lawfully, ethically and responsibly

Principle 3: You know how to keep generative AI tools secure

Principle 4: You have meaningful human control at the right stage

Principle 5: You understand how to manage the full generative AI lifecycle

Principle 6: You use the right tool for the job

Principle 7: You are open and collaborative

Principle 8: You work with commercial colleagues from the start

Principle 9: You have the skills and expertise that you need to build and use generative AI

Principle 10: You use these principles alongside your organisation’s policies and have the right assurance in place

It defines generative AI as a form of AI – 'a broad field which aims to use computers to emulate the products of human intelligence or to build capabilities which go beyond human intelligence'.

It then shows how publicly available LLMs fit within that broader field of generative AI.

The framework has lots of information to draw on, from advocating for lawful, ethical and responsible use to addressing the challenges of accuracy, bias and environmental impact. Transparency and human control are paramount going forward. You can read more here.

Tuesday, 23 January 2024

Purview in Microsoft Fabric

I am pleased to be speaking at Data Toboggan Winter Edition 2024 on Saturday 3rd February.
Register now for free: https://bit.ly/DT24-Register

My Abstract

Microsoft Fabric comes with Purview for data governance. What does that mean, and how can it help with managing your data estate? This session looks to connect the dots between the old and the new and explains which of the Purview apps exist in Fabric.

Wednesday, 10 January 2024

AI Builder Prompting Guide

Microsoft has released the AI Builder Prompting Guide. This is another useful tool to help with prompt engineering.

The guide explains: 'Prompts are how you build custom generative AI capabilities in Power Platform, like summarizing a body of text, drafting a response, or categorizing an incoming email. Think of prompts as a way of building custom GPT functions using only natural language.'

This comprehensive guide details the art of prompt engineering, providing insights on how to create effective prompts to maximize AI capabilities. 

There are six common uses for prompts:
  • Classifying text
  • Analysing sentiment
  • Rewriting content
  • Summarising information
  • Extracting information
  • Drafting a response
From summarisation to sentiment analysis, this guide helps explain prompts and strategies. It also talks about chain-of-thought prompting and how to lead the AI step by step through its reasoning process.
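To make that concrete, here is a rough sketch of a chain-of-thought style prompt written as a Python string. The wording, the categories and the {email_body} placeholder are my own illustrative assumptions, not text taken from the guide.

```python
# A minimal sketch of a chain-of-thought style prompt (illustrative wording only,
# not copied from the AI Builder Prompting Guide).
cot_prompt = """You are a support triage assistant.
Classify the email below into one of: Billing, Technical, Complaint, Other.

Think step by step:
1. Identify the main request in the email.
2. Note any keywords that point towards a category.
3. Give the single best category, then explain your choice in one sentence.

Email:
{email_body}
"""

# Fill in the template before sending it to a model or pasting it into a prompt builder.
print(cot_prompt.format(email_body="My January invoice is double what I expected."))
```

The numbered steps are what make this chain-of-thought: the model is asked to show its reasoning before committing to an answer.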

The guide also sets out what to do when writing prompts.

It then covers the important discussions that need to be had around responsible AI.

Sunday, 7 January 2024

Microsoft Fabric Adoption Roadmap

The Microsoft Fabric adoption roadmap is a series of articles to help strategically with actions and considerations. This roadmap looks at more than just the technological side. It considers growing a data culture across people, process and technology. It covers these areas:

  • Data culture
  • Executive sponsorship
  • Business alignment
  • Content ownership and management
  • Content delivery scope
  • Centre of Excellence
  • Governance
  • Mentoring and user enablement
  • Community of practice
  • User support
  • System oversight
  • Change management

Data culture in an organisation is critical to success and will greatly influence the outcomes. Ownership and governance guide how day-to-day activity takes place. Above all, any adoption requires executive sponsorship to drive business success.

Read more about the adoption roadmap



Wednesday, 3 January 2024

Copilot Different Experiences

Adopt, extend and build Copilot experiences across the Microsoft Cloud. Copilot uses Large Language Models (LLMs). There are various levels of expertise and effort required for each type, as each type connects to different data sources and requires a different level of customisation.

A few terms

Temperature – This controls how conservative or adventurous Copilot should be; the higher the temperature, the more adventurous the output.

Token – When text is processed by a large AI model, it is divided into its smallest units, called tokens. Tokens are the finest level of granularity in natural language processing for these large language models.
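As a rough illustration of tokenisation (not something specific to Copilot), the sketch below uses the open-source tiktoken library; the cl100k_base encoding is an assumption, and Copilot's own tokeniser may differ.

```python
# A minimal sketch of tokenisation using the open-source tiktoken library.
# The cl100k_base encoding is an assumption; Copilot's internal tokeniser may differ.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")
text = "Copilot uses Large Language Models."

token_ids = encoding.encode(text)
print(token_ids)                                   # the integer ids the model sees
print(len(token_ids), "tokens")                    # the count, useful for limits and cost
print([encoding.decode([t]) for t in token_ids])   # each token rendered back as text
```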

Prompt – The text you send to the Azure OpenAI Service as an API call. This text is then input into the AI model.

Metaprompt – Part of the orchestration layer: a set of instructions passed to Copilot and sent to the model on every turn of the conversation, and where safety tuning happens.
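To see how these terms fit together, here is a minimal sketch of a direct call to the Azure OpenAI Service using the openai Python package. The endpoint, key, deployment name and instruction text are placeholder assumptions, and a real Copilot does far more orchestration than this single call.

```python
# A minimal sketch showing prompt, system instructions and temperature in an
# Azure OpenAI Service call. Endpoint, key and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",   # the model deployment to call
    temperature=0.2,                  # low temperature = more conservative answers
    messages=[
        # System instructions play a role loosely analogous to a metaprompt:
        # they are sent on every turn to steer tone, scope and safety behaviour.
        {"role": "system", "content": "You answer questions about Microsoft Fabric concisely."},
        # The prompt: the text the user actually sends on this turn.
        {"role": "user", "content": "Explain what a token is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```

Lowering the temperature makes the answers more conservative and repeatable; raising it makes them more adventurous.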