As you know, one of the most basic features that nearly every backend application must have is logging of requests and responses. Especially in environments where multiple APIs are in play, it switches from a “nice-to-have” to a “must-have”.
There may come a time when we need to be able to show that an incoming request, sent by a client to our API, contained invalid parameters.
In short, it is a feature we should add to our API to get better “traceability“.
Within a short period of time, we needed to log all API requests and responses. Yes, there are many ways to do this. For example, we could log requests and responses by developing a middleware in each API. Of course, each approach has different tradeoffs.
Let’s say we have written such a middleware. Well, who will integrate it into all the APIs? Will it have a bad impact on performance? And, of course, one of the important constraints is time.
In this article, I will describe how I added a request and response logging feature to our existing system as a “quick-win” solution.
For a clearer understanding of the solution, I want to give a brief piece of information about our system. The project I’m working on is designed with a completely cloud-native, microservice approach. We orchestrate all of our applications on Azure Kubernetes Service, and we manage all of our APIs with Azure API Management.
Azure API Management is one of the key parts of our distributed system. In the context of this article, I won’t go into the details of the benefits of using an API Gateway in a microservice architecture.
However, I would like to mention one of the points where the API Gateway gave me a quick-win solution.
API Gateway allows us to create a “single entry point” and a uniform experience.
Hence, besides routing, it also allows us to perform manipulations on requests and responses.
In that case, why would I implement request and response logging in each API separately? Right?
Azure API Management, like other API gateways, has a logging policy. Using this policy, it is possible to perform platform-independent, centralized logging.
At this point, we will follow a few steps:
We have an advanced policy named “Log to EventHub” on Azure API Management. Using this policy, we can send logs to Event Hub with minimal performance impact (though we should still keep an eye on it) and then process/store them.
If you are not currently using an Event Hub, you need an Event Hub namespace and instance.
Azure Event Hub is an event ingestion service capable of receiving and processing millions of events per second.
You can follow the steps here to create it easily.
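If you prefer the CLI over the portal, the namespace and hub creation can also be sketched with the Azure CLI. The resource group, namespace, and location below are placeholders of my own, not values from this article:

```shell
# Assumption: you are already logged in (az login) and the resource
# group exists. All names below are placeholders; replace them with yours.
az eventhubs namespace create \
  --resource-group my-resource-group \
  --name my-eventhub-namespace \
  --location westeurope \
  --sku Basic

# Create the Event Hub instance inside the namespace.
az eventhubs eventhub create \
  --resource-group my-resource-group \
  --namespace-name my-eventhub-namespace \
  --name my-log-hub
```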
After completing the Event Hub creation, we need a shared access policy in order to use the Event Hub from Azure API Management. To get this policy, let’s click the Event Hub entity we have just created and enter its details.
Then click the “Shared access policies” tab on the left side, as shown above.
Now let’s click the “Add” button to create a policy. I have created a policy named “AzureAPIManagement“. After the creation operation, we will be able to access the connection string of the Event Hub instance, as shown above.
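As an alternative to the portal steps above, the same shared access policy and its connection string can be sketched via the Azure CLI. The resource group and namespace names are placeholder assumptions:

```shell
# Create a send-only authorization rule on the hub
# (names below are placeholders; "AzureAPIManagement" matches the
# policy name used in this article).
az eventhubs eventhub authorization-rule create \
  --resource-group my-resource-group \
  --namespace-name my-eventhub-namespace \
  --eventhub-name my-log-hub \
  --name AzureAPIManagement \
  --rights Send

# Read back the connection string we will give to Azure API Management.
az eventhubs eventhub authorization-rule keys list \
  --resource-group my-resource-group \
  --namespace-name my-eventhub-namespace \
  --eventhub-name my-log-hub \
  --name AzureAPIManagement \
  --query primaryConnectionString
```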
We have completed the Event Hub steps. Now we need a logger inside the Azure API Management. We will create a logger through the Azure API Management REST API.
Before starting the logger creation operation, we need to enable the REST API service on Azure API Management. To enable this service, check the “Enable API Management REST API” checkbox under the “Management API” tab of the Azure API Management instance, as follows.
After enabling the REST API, let’s get an access token by clicking the “Generate” button in the “Access token” section below.
Now we can create a logger. In order to create a logger, we need to send an HTTP PUT request to the following URL.
https://my-api-management.management.azure-api.net/loggers/my-logger?api-version=2018-01-01
NOTE: Don’t forget to replace the “my-api-management” and “my-logger” parts with your own service names.
There are a few parameters that we should add to the header.
and the payload should be:
{
  "loggerType": "AzureEventHub",
  "description": "My event hub logger",
  "credentials": {
    "name": "my-log-hub",
    "connectionString": "Endpoint=my-log-hub sender connection string"
  }
}
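Putting the pieces together, the request can be sketched with curl as follows. This is a sketch under the assumption that the access token generated under “Management API” is passed in the “Authorization” header in the “SharedAccessSignature …” form shown on the portal; the service name, logger name, and connection string are placeholders to be replaced with yours:

```shell
# Placeholders: my-api-management, my-logger, the token, and the
# connection string are all assumptions; substitute your own values.
curl -X PUT \
  "https://my-api-management.management.azure-api.net/loggers/my-logger?api-version=2018-01-01" \
  -H "Authorization: SharedAccessSignature <your-generated-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "loggerType": "AzureEventHub",
    "description": "My event hub logger",
    "credentials": {
      "name": "my-log-hub",
      "connectionString": "<my-log-hub sender connection string>"
    }
  }'
```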
The Event Hub logger is ready to use now. Let’s take note of the logger name, “my-logger“; we will use it later when defining the logging policy.
To define the logging policy, let’s enter the related Azure API Management instance and then click the “APIs” tab in the left menu. After that, let’s go to the “All APIs” section as follows.
What we want to do here is to add an Event Hub logger that will apply to all our APIs. To be able to do that we will add a “Log to EventHub” policy to the “inbound” and “outbound” processing sections.
To add the policy, let’s click the icon next to the “Policies” title on the right and go to the code view screen.
Now add the “Log to EventHub” policy, which is found under the “Advanced Policies” section, inside the “inbound” and “outbound” sections and edit it as follows.
<policies>
    <inbound>
        <set-variable name="message-id" value="@(Guid.NewGuid())" />
        <log-to-eventhub logger-id="my-logger" partition-id="1">@{
            var requestLine = string.Format("{0} {1} HTTP/1.1\r\n",
                context.Request.Method,
                context.Request.Url.Path + context.Request.Url.QueryString);

            var body = context.Request.Body?.As<string>(true);
            if (body != null && body.Length > 1024)
            {
                body = body.Substring(0, 1024);
            }

            var headers = context.Request.Headers
                .Where(h => h.Key != "Authorization" && h.Key != "Ocp-Apim-Subscription-Key")
                .Select(h => string.Format("{0}: {1}", h.Key, String.Join(", ", h.Value)))
                .ToArray<string>();

            var headerString = (headers.Any()) ? string.Join("\r\n", headers) + "\r\n" : string.Empty;

            return "request:" + context.Variables["message-id"] + "\n" + requestLine + headerString + "\r\n" + body;
        }</log-to-eventhub>
    </inbound>
    <backend>
        <forward-request />
    </backend>
    <outbound>
        <log-to-eventhub logger-id="my-logger" partition-id="1">@{
            var statusLine = string.Format("HTTP/1.1 {0} {1}\r\n",
                context.Response.StatusCode,
                context.Response.StatusReason);

            var body = context.Response.Body?.As<string>(true);
            if (body != null && body.Length > 1024)
            {
                body = body.Substring(0, 1024);
            }

            var headers = context.Response.Headers
                .Select(h => string.Format("{0}: {1}", h.Key, String.Join(", ", h.Value)))
                .ToArray<string>();

            var headerString = (headers.Any()) ? string.Join("\r\n", headers) + "\r\n" : string.Empty;

            return "response:" + context.Variables["message-id"] + "\n" + statusLine + headerString + "\r\n" + body;
        }</log-to-eventhub>
    </outbound>
    <on-error />
</policies>
The point to consider here is the “logger-id” parameter: we set it to the logger name we specified when creating the Event Hub logger. In addition, we correlate requests and responses with the “message-id” variable and send them to the Event Hub.
Btw, it’s nice to be able to write C# based policy in Azure API Management, right?
Now we are keeping the request and response logs on the Event Hub. There are a few ways to consume these logs from the Event Hub. At this point, I chose Azure Functions, a serverless compute service, to avoid any extra effort.
First, let’s create an Azure Function by following the steps in this documentation.
When you reach the “Create an HTTP triggered function” section of the documentation, select the “All” option by clicking the “Change template filter” option. Then, select the “EventHubTrigger” template as shown above. After that, give the function a name (I named it RequestLoggingFunction) and follow the rest of the steps to complete the function creation process.
Now we have an Azure Function template.
There are two operations that we will do here.
Since we are already using GrayLog, I have decided to integrate request and response message logs into the GrayLog. If you wish, you can take a look at different solutions like Moesif.
If you want to get a bit of information about the installation of GrayLog, you can check this article.
First, let’s add NLog’s Gelf target package to the project via NuGet.
dotnet add package NLog.Web.AspNetCore.Targets.Gelf
Then, let’s create a new folder called “Models” and define the following three classes.
public class HTTPRequestMessageLog
{
    public string Method { get; set; }
    public Guid MessageId { get; set; }
    public string UrlPath { get; set; }
    public string Content { get; set; }
}
public class HTTPResponseMessageLog
{
    public Guid MessageId { get; set; }
    public string StatusCode { get; set; }
    public string Content { get; set; }
}
NOTE: You can change the message log structure as you wish.
public class HttpMessage
{
    public Guid MessageId { get; set; }
    public bool IsRequest { get; set; }
    public HttpRequestMessage HttpRequestMessage { get; set; }
    public HttpResponseMessage HttpResponseMessage { get; set; }

    public static HttpMessage Parse(Stream stream)
    {
        using (var sr = new StreamReader(stream))
        {
            return Parse(sr.ReadToEnd());
        }
    }

    public static HttpMessage Parse(string data)
    {
        var httpMessage = new HttpMessage();
        HttpContent content;

        using (var sr = new StringReader(data))
        {
            // First line of data is (request|response) followed by a GUID to link request to response.
            // Rest of data is in message/http format.
            var firstLine = sr.ReadLine().Split(':');
            if (firstLine.Length < 2)
            {
                throw new ArgumentException("Invalid formatted event :" + data);
            }

            httpMessage.IsRequest = firstLine[0] == "request";
            httpMessage.MessageId = Guid.Parse(firstLine[1]);

            var stream = new MemoryStream(Encoding.UTF8.GetBytes(sr.ReadToEnd()));
            stream.Position = 0;
            content = new StreamContent(stream);
        }

        var contentType = new MediaTypeHeaderValue("application/http");
        content.Headers.ContentType = contentType;

        if (httpMessage.IsRequest)
        {
            contentType.Parameters.Add(new NameValueHeaderValue("msgtype", "request"));
            // Using .Result isn't too evil because content is a locally buffered memory stream.
            // Although if this were hosted in a System.Web based ASP.NET host it might block.
            httpMessage.HttpRequestMessage = content.ReadAsHttpRequestMessageAsync().Result;
        }
        else
        {
            contentType.Parameters.Add(new NameValueHeaderValue("msgtype", "response"));
            httpMessage.HttpResponseMessage = content.ReadAsHttpResponseMessageAsync().Result;
        }

        return httpMessage;
    }
}
Through the “HttpMessage” class, we will parse the log data that we consume from the Event Hub into the “HttpRequestMessage” and “HttpResponseMessage” types.
https://github.com/dgilling/ApimEventProcessor/blob/master/src/ApimEventProcessor/HttpMessage.cs
Let’s edit the class that comes with the function template as follows.
public static class RequestLoggingFunction
{
    private static NLog.ILogger logger = null;

    [FunctionName("RequestLoggingFunction")]
    public static async Task Run(
        [EventHubTrigger("my-log-hub", Connection = "EVENTHUB_CONNECTION")] EventData[] events,
        ILogger log,
        ExecutionContext context)
    {
        ConfigureLogger(context.FunctionAppDirectory);

        var exceptions = new List<Exception>();

        foreach (EventData eventData in events)
        {
            try
            {
                string messageBody = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);

                var httpMessage = HttpMessage.Parse(messageBody);

                if (httpMessage.IsRequest)
                {
                    var httpRequestMessageLog = new HTTPRequestMessageLog();
                    httpRequestMessageLog.MessageId = httpMessage.MessageId;
                    httpRequestMessageLog.UrlPath = httpMessage.HttpRequestMessage.RequestUri.ToString();
                    httpRequestMessageLog.Method = httpMessage.HttpRequestMessage.Method.Method;
                    httpRequestMessageLog.Content = await httpMessage.HttpRequestMessage.Content.ReadAsStringAsync();

                    logger.Info(JsonConvert.SerializeObject(httpRequestMessageLog));
                }
                else
                {
                    var httpResponseMessageLog = new HTTPResponseMessageLog();
                    httpResponseMessageLog.MessageId = httpMessage.MessageId;
                    httpResponseMessageLog.StatusCode = httpMessage.HttpResponseMessage.StatusCode.ToString();
                    httpResponseMessageLog.Content = await httpMessage.HttpResponseMessage.Content.ReadAsStringAsync();

                    logger.Info(JsonConvert.SerializeObject(httpResponseMessageLog));
                }

                await Task.Yield();
            }
            catch (Exception e)
            {
                // We need to keep processing the rest of the batch - capture this exception and continue.
                // Also, consider capturing details of the message that failed processing so it can be processed again later.
                exceptions.Add(e);
            }
        }

        // Once processing of the batch is complete, if any messages in the batch failed processing,
        // throw an exception so that there is a record of the failure.
        if (exceptions.Count > 1)
            throw new AggregateException(exceptions);

        if (exceptions.Count == 1)
            throw exceptions.Single();
    }

    private static void ConfigureLogger(string appDirectory)
    {
        if (logger == null)
        {
            string nlogConfigPath = Path.Combine(appDirectory, "nlog.config");
            NLog.LogManager.Configuration = new XmlLoggingConfiguration(nlogConfigPath);
            logger = NLog.LogManager.GetCurrentClassLogger();
        }
    }
}
NOTE: Don’t forget to change the “my-log-hub” and “EVENTHUB_CONNECTION” values that I passed to the “EventHubTrigger” attribute as parameters.
So let’s take a look at what we did in the function.
We configured the logger of NLog in order to send a log to GrayLog. In the “Run” method, we performed a logging operation by parsing events according to the related event type.
Now we need to create an “nlog.config” file under the root directory of the function, as follows.
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      throwExceptions="false"
      internalLogLevel="Off">
  <extensions>
    <add assembly="NLog.Web.AspNetCore"/>
    <add assembly="NLog.Web.AspNetCore.Targets.Gelf"/>
  </extensions>
  <targets>
    <target xsi:type="Gelf"
            name="graylog"
            endpoint="udp://localhost:12201"
            facility="APM.RequestLogging.Function"
            SendLastFormatParameter="true" />
  </targets>
  <rules>
    <logger level="Info" writeTo="graylog" />
  </rules>
</nlog>
NOTE: Don’t forget to also edit the GrayLog “endpoint“.
Now we are ready to deploy. Well, in order to perform deployment operation quickly, first let’s follow these steps.
NOTE: When creating a function on Azure, I recommend selecting the pricing tier carefully. I chose the “App Service” plan instead of the “Consumption” plan to keep my function, which will log all request and response messages, under control.
After completing the deployment process, we need to add the Event Hub‘s connection string information to the function’s app settings section via the Azure Portal.
First, let’s enter the “Function Apps” tab on the portal and click on the function we have created. Then click the “Application settings” option by entering the “Platform features” section.
Since I defined the “Connection” key in the function as “EVENTHUB_CONNECTION“, I added a setting with the same name in the app settings.
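If you prefer the CLI over the portal for this step, the same app setting can be sketched with the Azure CLI. The function app and resource group names below are placeholder assumptions:

```shell
# Placeholders: my-function-app and my-resource-group are assumptions.
# EVENTHUB_CONNECTION must match the Connection key used in the
# EventHubTrigger attribute of the function.
az functionapp config appsettings set \
  --name my-function-app \
  --resource-group my-resource-group \
  --settings "EVENTHUB_CONNECTION=<event hub connection string>"
```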
So finally we are ready. If we haven’t made any mistakes, the function we created will start running successfully. To check whether the function is running successfully, you can enter the “Monitor” tab on the screen above.
An example request log will appear in GrayLog as follows.
Although it may seem like a simple operation, logging request and response messages is important in many situations. I have tried to show how I did it within our existing system, without adding much overhead, using the services that Azure provides us.
Happy clouding!