Semantic Kernel: The Kernel

Digest   2024-06-12 07:20   Japan

SK currently supports OpenAI, Azure OpenAI, Gemini, HuggingFace, MistralAI, and other LLM providers, and the list will likely keep growing.

First, reference the connector package for each LLM you want to use. The project file looks like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <RootNamespace>Demo01_Kernel</RootNamespace>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging" Version="9.0.0-preview.4.24266.19" />
    <PackageReference Include="Microsoft.Extensions.Logging.Console" Version="9.0.0-preview.4.24266.19" />
    <PackageReference Include="Microsoft.SemanticKernel" Version="1.14.1" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.Google" Version="1.14.1-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.HuggingFace" Version="1.14.1-preview" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.MistralAI" Version="1.14.1-alpha" />
  </ItemGroup>
</Project>

As an SDK that wraps LLMs, SK revolves around the Kernel, much like the host builder in ASP.NET Core: first create a builder via Kernel.CreateBuilder(), register services on the builder's Services collection, then call Build() to obtain the Kernel and use it to perform operations.

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Console;
using Microsoft.Extensions.Options;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Services;
using System.Diagnostics.CodeAnalysis;
var chatModelId = "gpt-4o";
var key = File.ReadAllText(@"C:\GPT\key.txt");
var endpoint = "";

#pragma warning disable SKEXP0010
#pragma warning disable SKEXP0070
var builder = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(chatModelId, key);
    //.AddAzureOpenAIChatCompletion(chatModelId, endpoint, key)
    //.AddGoogleAIGeminiChatCompletion(chatModelId, key)
    //.AddHuggingFaceChatCompletion(chatModelId, apiKey: key)
    //.AddMistralChatCompletion(chatModelId, key)

// Register the custom service selector
builder.Services.AddScoped<IAIServiceSelector, MyAIServiceSelector>();
// Register a custom service
builder.Services.AddScoped<IMyService, MyService>();
// Register logging
builder.Services.AddLogging(c => c
    .AddConsole()
    //.AddJsonConsole()
    .SetMinimumLevel(LogLevel.Information));
Kernel kernel = builder.Build();
var logger = kernel.LoggerFactory.CreateLogger("logger");
var prompt = "Hello, what can you help me with?";
var result = await kernel.InvokePromptAsync(prompt);
var message = $"Response: {result.GetValue<string>()}";
logger.LogInformation(message);
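Besides InvokePromptAsync, the built Kernel also hands back the services it hosts. As a minimal sketch (assuming the same kernel and logger built above, plus the Microsoft.SemanticKernel.ChatCompletion namespace), you can resolve the chat completion service directly and drive a multi-turn conversation with a ChatHistory:

```csharp
// Resolve the chat completion service registered on the kernel
var chat = kernel.GetRequiredService<IChatCompletionService>();

// Build up a conversation turn by turn
var history = new ChatHistory();
history.AddSystemMessage("You are a helpful assistant.");
history.AddUserMessage("Hello, what can you help me with?");

// Send the whole history to the model and log the reply
var reply = await chat.GetChatMessageContentAsync(history);
logger.LogInformation($"Response: {reply.Content}");
```

Working at this level instead of InvokePromptAsync gives you control over the system message and the accumulated history across turns.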

class MyAIServiceSelector : IAIServiceSelector
{
    private readonly IMyService _myService;
    private readonly ILogger<MyAIServiceSelector> _logger;

    public MyAIServiceSelector(IMyService myService, ILogger<MyAIServiceSelector> logger)
    {
        _myService = myService;
        _logger = logger;
    }

    public bool TrySelectAIService<T>(
        Kernel kernel, KernelFunction function, KernelArguments arguments,
        [NotNullWhen(true)] out T? service,
        out PromptExecutionSettings? serviceSettings) where T : class, IAIService
    {
        _myService.Print();
        foreach (var serviceToCheck in kernel.GetAllServices<T>())
        {
            var serviceModelId = serviceToCheck.GetModelId();
            var endpoint = serviceToCheck.GetEndpoint();
            if (!string.IsNullOrEmpty(serviceModelId))
            {
                _logger.LogInformation($"Model in use: {serviceModelId} {endpoint}");
                _logger.LogInformation($"Service type: {serviceToCheck.GetType().Name}");
                service = serviceToCheck;
                serviceSettings = new OpenAIPromptExecutionSettings();
                return true;
            }
        }
        service = null;
        serviceSettings = null;
        return false;
    }
}

// A custom service
interface IMyService
{
    void Print();
}

class MyService : IMyService
{
    readonly ILogger<MyService> _logger;

    public MyService(ILogger<MyService> logger)
    {
        _logger = logger;
        logger.LogInformation("MyService instantiated");
    }

    public void Print()
    {
        _logger.LogWarning("Raising an alert");
    }
}

