Moving REST to GraphQL with ASP.NET Core & Entity Framework Core

GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of the data in your API, gives clients the power to ask for exactly what they need and nothing more, makes it easier to evolve APIs over time, and enables powerful developer tools.

Source: graphql.org

GraphQL queries are how clients ask the GraphQL server for the data they need. What is interesting about GraphQL is that each client can write custom-made queries based on its individual needs. This means GraphQL enables the client to ask for exactly what it wants in a query and returns a response containing only what was asked for. This approach gives the client more power.

Benefits of GraphQL:

  • Good fit for complex systems and microservices: By integrating multiple systems behind its API, GraphQL unifies them and hides their complexity. The GraphQL server is then responsible for fetching the data from the existing systems and packaging it up in the GraphQL response format.
  • Fetch data in a single call and avoid multiple round trips: GraphQL is less chatty than REST; REST APIs often require multiple round trips between the client and different resources to fetch all the data the calling app needs to render.

GraphQL solves the round-trip problem by allowing the client to create a single query that calls several related functions (or resolvers) on the server and constructs a response with multiple resources – all in a single request. This is a much more efficient method of data delivery, requiring fewer resources than multiple round trips.

  • Avoid over-fetching and under-fetching problems: REST API responses are known for either containing too much data or not enough of it, because it is very hard to design an API flexible enough to fulfil every client’s precise data needs. GraphQL solves this efficiency problem by fetching exactly the required data in a single request (see the example query below).
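For example, instead of calling an /employees endpoint and then a separate address endpoint for each employee, a GraphQL client can ask for exactly the fields it needs across both resources in one request. The query below is only an illustration; the field names follow the employee schema built later in this post, and the address fields are assumed:

{
  employees {
    name
    address {
      city
    }
  }
}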

Building a GraphQL Service in ASP.NET Core

  1. Installing GraphQL in .NET Core: GraphQL support is not built into ASP.NET Core, so you need a NuGet package. The most commonly used packages are listed below.
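The setup in the rest of this post relies on the GraphQL .NET server packages; a typical set (exact package choice and versions are an assumption, not the only option) is:

  • GraphQL – the core GraphQL .NET library (graph types, schema and query execution).
  • GraphQL.Server.Transports.AspNetCore – the ASP.NET Core middleware (AddGraphQL, UseGraphQL).
  • GraphQL.Server.Ui.Playground – the GraphQL Playground UI middleware (UseGraphQLPlayground).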

2. Setting up graph types: A graph type is a class that derives from the ObjectGraphType<T> base class, which implements IObjectGraphType. In the constructor you can declare fields for this graph type.

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using GraphQL;
using GraphQL.Types;
using WebApiWithGraphQl.Data.Entities;
using WebApiWithGraphQl.Repositories;

namespace WebApiWithGraphQl.GraphQ.Types
{
    // Graph type describing how an Employee entity is exposed over GraphQL.
    public class EmployeType : ObjectGraphType<Employee>
    {
        public EmployeType(EmployeeRepository employeeRepository)
        {
            Field(x => x.EmployeId);
            Field(x => x.Name);

            // Expose the employment type through its own enum graph type.
            Field<EmployeeTypeEnumType>(
                "EmploymentType",
                resolve: context => context.Source.EmployeType.ToString());

            // Resolve the employee's addresses through the repository.
            Field<ListGraphType<AddressType>>(
                "Address",
                resolve: context => employeeRepository.GetAdddressById(context.Source.EmployeId));
        }
    }
}
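The Address field above refers to an AddressType graph type that is not shown in this post. A minimal sketch could look like the following (the Street and City properties of the Address entity are assumptions); EmployeeTypeEnumType can be defined in a similar way as an EnumerationGraphType over the employment-type enum:

using GraphQL.Types;
using WebApiWithGraphQl.Data.Entities;

namespace WebApiWithGraphQl.GraphQ.Types
{
    // Graph type describing how an Address entity is exposed over GraphQL.
    // Street and City are assumed properties of the Address entity.
    public class AddressType : ObjectGraphType<Address>
    {
        public AddressType()
        {
            Field(x => x.Street);
            Field(x => x.City);
        }
    }
}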

3. Define resolver: Now that we have an employee graph type we need another class that knows how to get employees. I call it EmployeeQuery, and it also derives from ObjectGraphType.
In the constructor I declare one field, and this time I explicitly state that this field must return a list of EmployeType objects. As you can see, even a list is a special graph type (ListGraphType).
Then I give the field a name, and in a lambda I specify where the data should come from – in other words, how the data should be resolved.

Resolvers are the functions responsible for supplying the data requested by a query; they are the integration point between our application’s data source and the GraphQL infrastructure.

using GraphQL.Types;
using WebApiWithGraphQl.GraphQ.Types;
using WebApiWithGraphQl.Repositories;

namespace WebApiWithGraphQl.GraphQ.Query
{
    // Root query type: defines the entry points clients can query.
    public class EmployeeQuery : ObjectGraphType
    {
        public EmployeeQuery(EmployeeRepository employeeRepository)
        {
            // "Employees" returns a list of EmployeType objects resolved from the repository.
            Field<ListGraphType<EmployeType>>(
                "Employees",
                resolve: context => employeeRepository.GetAllEmployees());
        }
    }
}

4. Set up schema: A GraphQL schema is at the center of any GraphQL server implementation and describes the functionality available to the clients that connect to it.

GraphQL implements a human-readable schema syntax known as its Schema Definition Language, or “SDL”. The SDL is used to express the types available within a schema and how those types relate to each other. 
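As a rough illustration, the employee schema built in this post corresponds to SDL along these lines (names here are illustrative only; GraphQL .NET derives the actual type and field names from the C# classes shown below):

type Employee {
  employeId: ID
  name: String
  employmentType: EmploymentType
  address: [Address]
}

type Query {
  employees: [Employee]
}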

using GraphQL;
using GraphQL.Types;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using WebApiWithGraphQl.GraphQ.Query;

namespace WebApiWithGraphQl.GraphQl
{
    // The schema wires the root query type into GraphQL.
    public class EmployeSchema : Schema
    {
        public EmployeSchema(IDependencyResolver resolver) : base(resolver)
        {
            // Resolve the root query type from the dependency resolver.
            Query = resolver.Resolve<EmployeeQuery>();
        }
    }
}

Configuring ASP.NET Core with GraphQL Middleware

To set up GraphQL in your project and start using the schema we created, go to the Startup class of your application.

  • Add a dependency resolver so the schema can obtain the query instance from the container.
  • Call the AddGraphQL extension method to register all the types GraphQL .NET uses.
  • Call AddGraphTypes, which scans the assembly for all ObjectGraphTypes and registers them automatically in the container using the specified lifetime.
public void ConfigureServices(IServiceCollection services)
{
    services.AddDbContext<EmployeeContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("EmpoyeeDBConnectionString")));
    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);
    services.AddSwaggerGen(c =>
    {
        c.SwaggerDoc("v1", new Swashbuckle.AspNetCore.Swagger.Info { Title = "My API", Version = "v1" });
    });

    // Dependency resolver that lets the schema resolve graph types from the container.
    services.AddScoped<IDependencyResolver>(s => new FuncDependencyResolver(s.GetRequiredService));
    // Register the repository and the schema used by the GraphQL middleware.
    services.AddScoped<EmployeeRepository>();
    services.AddScoped<EmployeSchema>();

    services.AddGraphQL(o => { o.ExposeExceptions = false; })
        .AddGraphTypes(ServiceLifetime.Scoped);
}

Now we have to add the GraphQL middleware by calling the UseGraphQL extension method in the Configure method of the Startup class.

public void Configure(IApplicationBuilder app, IHostingEnvironment env, EmployeeContext context)
{
    // GraphQL endpoint plus the interactive Playground UI.
    app.UseGraphQL<EmployeSchema>();
    app.UseGraphQLPlayground(new GraphQLPlaygroundOptions());

    // Seed sample data into the database.
    context.Seed();

    app.UseHttpsRedirection();
    app.UseMvc();
    app.UseSwagger();
    app.UseSwaggerUI(c =>
    {
        c.SwaggerEndpoint("/swagger/v1/swagger.json", "API with graphQl V1");
    });
}

So far we have completed all the mandatory steps for integrating GraphQL with an ASP.NET Core API. If you want to see the Playground UI as soon as you start the API, go to the properties of the project, select the Debug tab, activate “Launch browser” and enter ui/playground as the URL.
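Doing this through the Debug tab simply edits Properties/launchSettings.json; the relevant profile ends up looking roughly like this (the profile name and ports below are assumptions and will match your own project):

{
  "profiles": {
    "WebApiWithGraphQl": {
      "commandName": "Project",
      "launchBrowser": true,
      "launchUrl": "ui/playground",
      "applicationUrl": "https://localhost:5001;http://localhost:5000"
    }
  }
}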

When the browser opens, take a look at the Schema tab on the right side. The Playground has read the schema metadata.
Using this metadata, the query editor can offer IntelliSense.
Type a query such as the one shown below. It fetches all employees, but only the requested fields. When I execute it, you can see that the result is JSON and the data is contained in the data root node, which has an employees array with exactly the data I asked for.
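A query along these lines works against the employee schema defined above (the field names assume GraphQL .NET’s default camel-casing):

{
  employees {
    name
    employmentType
  }
}

and the response comes back as JSON under the data root node:

{
  "data": {
    "employees": [
      { "name": "...", "employmentType": "..." }
    ]
  }
}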

PlaygroundQueryExecution

Containerised ASP.NET Core WebAPI with Docker on Mac

The new .NET Core is the biggest change since the invention of the .NET platform. It is fully open source and is supported on Windows, Linux and macOS. In this post I am going to give it a test ride by creating a containerised C# application with the latest .NET Core.

Docker containers allow teams to build, test, replicate and run software regardless of where the software is deployed. Containers assure teams that software will always behave the same no matter where it runs – there is no inconsistency in behaviour, which allows for more accurate and reliable testing.

The main advantage of using Docker containers is that they mimic running an application in a virtual machine, minus the hassle and resource neediness of a virtual machine. Available for Windows and Linux systems, Docker containers are lightweight and simple, taking up only a small amount of operating system capacity. They start up in seconds and require little disk space or memory.

Docker installers are available for Mac and Windows and can be downloaded from the official Docker website.

Prerequisites:

  1. Install Visual Studio for Mac
  2. Install Docker for Mac

Here we will build an ASP.NET Core Web API and host/run it in a container with the help of Docker.

1. Create a new project: choose the ASP.NET Core Web API template in Visual Studio.

D1.png

Now provide the project name, solution name and other details such as where to save the project.

D2.png

Our newly created project structure looks like this:

D3.png

Here we have a very simple case where the service only returns some information about employees, as our main objective is to host this tiny application in a container.

2. Add Docker support to the application

Now add a Dockerfile to the project and write the instructions that describe how the Docker image is built from the ASP.NET Core base image.

 

Below are the instructions issued to the daemon to create the Docker image.
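A typical multi-stage Dockerfile for an ASP.NET Core 2.1 Web API looks roughly like this (the project name FirstApiWithDocker and the published DLL name are assumptions – adjust them to match your own project):

# Build stage: restore and publish using the .NET Core SDK image
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet restore
RUN dotnet publish -c Release -o /app

# Runtime stage: copy the published output into the smaller runtime image
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app .
EXPOSE 80
ENTRYPOINT ["dotnet", "FirstApiWithDocker.dll"]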

 

3. Open Terminal on Mac:

Search for “Terminal” on the Mac and open a new window.

D6.png

Once we open a new window, a command prompt appears and we are all set to issue Docker commands to create the image.

D7.png

4. Navigate to the application folder by issuing the change-directory command “cd” and make sure we are inside the application folder. We can verify this by issuing the “ls” command and checking that the project files are listed, which means we are in the right place.

D8.png

5. Create the Docker image:

The docker build command builds Docker images from a Dockerfile and a “context”. A build’s context is the set of files located in the specified PATH or URL. The build process can refer to any of the files in the context.

Command: docker build -t <image-name> .

Here our image name is “firstapiwithdocker”, so the command is:

docker build -t firstapiwithdocker .

At this point we can see the daemon accept the command and start creating the Docker image from the Dockerfile instructions.

D9.png

Finally we can see that the image has been created successfully and tagged as “latest”. If we don’t provide a tag, the daemon tags the image as “latest” by default.

D10.png

6. List all Docker images:

Now we have to verify whether the required image has been created, so we issue the command below to list all images. It shows the important information about each image, such as repository name, tag, image ID, created date and size. In the screenshot below we can see our newly created image listed alongside the base images.

Command: docker images

D11.png

7. Run the image in a container:

So far we have successfully created a Docker image for our Web API solution containing all the necessary files; now we have to create a container to run this image.

The command below creates the container.

Syntax: docker run -d -p <host-port>:<container-port> --name <container-name> <image-name>

Example: docker run -d -p 9000:80 --name FirstContainer firstapiwithdocker

Once we execute the above command, a new container is created and a long identifier is printed, which means the container has been created successfully.

D12.png

Now list all containers (docker ps) and we can see the important metadata about each container, such as container ID, image name, command, created date, status, ports and container name.

Here our newly created container is running, mapping port 9000 on the host to port 80 in the container.

D13.png

Let’s hit the URL “http://localhost:9000/api/values” in a browser or Postman to verify that our application is running in the container.

Below is the result from the Web API, which is running in a container instead of directly on the local machine.

D14

8. Push the Docker image to Docker Hub:

Docker Cloud uses Docker Hub as its native registry for storing both public and private repositories. Once you push your images to Docker Hub, they are available in Docker Cloud.

We need to create a Docker Hub account to push the image to public/private repositories.

D15

 

The Docker image should be tagged with a fully qualified name (your Docker Hub username plus a repository name) before issuing the push command, so the command below tags the image as “rakeshmahur/webapicore-sample”.

Syntax: docker tag <source-image> <username>/<repository>

Example:  docker tag firstapiwithdocker rakeshmahur/webapicore-sample

Now log in to Docker Hub from the terminal window by issuing the “docker login” command and providing your Docker Hub account details (username/password).

D16.png

Once the Docker Hub credentials have been validated successfully, a confirmation message is shown and we are able to push the image to Docker Hub.

D17

Issue the docker push command to push the image to Docker Hub.

docker push rakeshmahur/webapicore-sample

Once we execute the above command, our local Docker image is pushed to the Docker Hub repository and listed there, and anyone can pull the image and start working with it.

D19.png

Docker Commands

Below are some important and commonly used commands; refer to the Docker documentation for more details and a more exhaustive list of flags.

  • docker build -t <image-name> .
    • Builds an image from a given Dockerfile. While still useful when handling individual images, ultimately docker-compose will build your project’s images.
  • docker exec -it <container> <command>
    • Runs a command in a running container. More than anything else, I’ve used exec to run a bash session (docker exec -it <container> /bin/bash).
  • docker image ls
    • Lists images on your machine.
  • docker image prune
    • Removes unused images from your machine. Especially when building new images, I’ve found myself constantly wanting a clean slate. Combining prune with other commands helps clear up the clutter.
  • docker inspect <container>
    • Outputs JSON-formatted details about a given container. More than anything else I look for the IP address via (docker inspect <container> | grep IPAddress).
  • docker pull <image>
    • Downloads a given image from a remote repository. For development purposes, docker-compose will abstract this away, but if you want to run an external tool or run the project on a new machine you’ll use pull.
  • docker ps
    • Without any flags, this lists all running containers on your machine. I’m constantly tossing on the ‘-a’ flag to see what containers I have across the board. While you are building a new image you inevitably have containers spawned from it exit prematurely due to some runtime error. You’ll need to do ‘docker ps -a’ to look up the container.
  • docker push <image>
    • Once you have an image ready to be distributed/deployed you’ll use push to release it to either Docker Hub or a private repository.
  • docker rm <container>
    • Removes a stopped container from your system. You need to run docker stop first if it is still running.
  • docker rmi <image>
    • Removes an image. You may need to add the ‘--force’ flag to force removal if it is in use (provided you know what you are doing).
  • docker run <image>
    • Runs a command in a new container. Learning the various flags for the run command will be extremely useful. The flags I’ve been using heavily are as follows:
      • --rm – Removes the container after you end the process
      • -it – Runs the container interactively
      • --entrypoint – Overrides the default command the image specifies
      • -v – Maps a host volume into the container. For development, this allows us to use the image’s full environment and tools, but provide it our source code instead of production build files.
      • -p – Maps a custom port (e.g. 8080:80)
      • --name – Gives the container a human-readable name, which eases troubleshooting
      • --no-cache – A docker build flag rather than a run flag: forces Docker to re-execute each build step instead of reusing cached layers.
  • docker version
    • Outputs both the client and server versions of Docker being run. This isn’t the same as ‘-v’.
  • docker volume ls
    • While there are variants on volumes, so far I mostly use the ‘ls’ command to list current volumes for troubleshooting. I’m sure there will be more to come with volumes.