Datastore is a NoSQL offering from Google, part of the Google Cloud Platform (GCP). The big mind shift, if you're used to a relational database, is that each row (although they aren't really rows) in a table (they aren't really tables) can have a different shape. The best analogy I could come up with is a text document: each line can have a different number of words, numbers and symbols.
However, just because it isn't relational doesn't mean you can ignore structure; in fact, there is arguably more onus on the designer to consider what the data will be used for, and where.
In order to follow this post, you’ll need an account on GCP, and a Cloud Platform Project.
Set Up a New Cloud Datastore
The first thing to do is to set up a new Datastore:
The next step is to select a Zone. The big thing to consider, in terms of both cost and speed, is to co-locate your data where possible. You'll incur egress charges (that is, you'll be charged as your data leaves its zone), so your zone should be co-located with anything that accesses the data. In this example, you're accessing the data from your own machine, so pick a zone close to where you live.
In Britain, we're in europe-west2 (London):
Entities and Properties
The next thing is to set up a new entity. An entity is loosely analogous to a row in a relational database, and entities are grouped under a kind, which is loosely analogous to a table.
Now we have an entity, it needs some properties. A property is, again, loosely analogous to a field, except that properties are not required to be consistent across entities of the same kind. I'm unsure how this works behind the scenes, but the console appears simply to null out the columns that have no value; I suspect this may just be a display artefact.
You can set the value (as above), and then query the data, either in a table format (as below):
Or you can use a SQL-like syntax called GQL (as below).
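As a sketch of what such a query might look like, here's a GQL query against the kind and property used later in this post (`MyTest` and `test1`); the filter value is just an example:

```sql
SELECT * FROM MyTest WHERE test1 = 'Hello, World'
```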
In order to access the Datastore from outside GCP, you'll need a credentials file. You'll need to start off in the Credentials screen:
In this instance, we'll set up a service account key:
This creates the key as a JSON file:
The file should look broadly like this:
{
  ...
  "private_key": "-----BEGIN PRIVATE KEY-----\nkeydata\n-----END PRIVATE KEY-----\n",
  "client_email": "[email protected]",
  ...
}
Keep hold of this file, as you’ll need it later.
There is a .NET client library provided for accessing this functionality from your website or desktop app. What we'll do next is access that entity from a console application. The obvious first step is to create one:
Remember that credentials file I said to hang on to? Well, now you need it. It needs to be accessible from your application; there are a number of ways to address this problem, and the one I'm demonstrating here is probably not a sensible solution in real life, but for the purpose of testing it works fine.
Copy the credentials file into your project directory and include it in the project, then, set the properties to:
Build Action: None
Copy to Output Directory: Copy if Newer
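With the file copied to the output directory, one option (again, for local testing only) is to point the GOOGLE_APPLICATION_CREDENTIALS environment variable at it before creating any client. This is a minimal sketch; the filename `my-credentials.json` is an assumption, so use whatever your downloaded key file is actually called:

```csharp
using System;

// For local testing only: point the Application Default Credentials
// lookup at the JSON key file copied to the output directory.
// "my-credentials.json" is a placeholder name.
Environment.SetEnvironmentVariable(
    "GOOGLE_APPLICATION_CREDENTIALS",
    "my-credentials.json");
```

In real life you'd set this outside the application (in the environment or deployment configuration) rather than hard-coding a path.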
GCP Client Package
You’ll need to install the correct NuGet package:
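The Datastore client library lives in the Google.Cloud.Datastore.V1 package; from the Package Manager Console, that's:

```powershell
Install-Package Google.Cloud.Datastore.V1
```

(At the time of writing, the package may still be in pre-release, in which case you'll need the -Pre flag.)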
Your Project ID
As you use the GCP more, you’ll come to appreciate that the project ID is very important. You’ll need to make a note of it (if you can’t find it, simply select Home from the hamburger menu):
All the pieces are now in place, so let’s write some code to access the datastore:
using Google.Cloud.Datastore.V1;
using Grpc.Core;
using Grpc.Core.Logging;

// Send gRPC diagnostic output to the console (useful when debugging)
GrpcEnvironment.SetLogger(new ConsoleLogger());

// Your Google Cloud Platform project ID
string projectId = "my-project-id";

DatastoreClient datastoreClient = DatastoreClient.Create();
DatastoreDb db = DatastoreDb.Create(projectId, "TestNamespace", datastoreClient);

string kind = "MyTest";
string name = "newentitytest3";

KeyFactory keyFactory = db.CreateKeyFactory(kind);
Key key = keyFactory.CreateKey(name);

// Property names don't need to be consistent across entities of the same kind
var task = new Entity
{
    Key = key,
    ["test1"] = "Hello, World",
    ["test2"] = "Goodbye, World",
    ["new field"] = "test"
};

// Write the entity inside a transaction
using (DatastoreTransaction transaction = db.BeginTransaction())
{
    transaction.Upsert(task);
    transaction.Commit();
}
If you now check, you should see that your Datastore has been updated:
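You can also verify from code by reading the entity back with the same key; a minimal sketch, assuming the write above has already run:

```csharp
// Look up the entity we just wrote, using the same key
Entity fetched = db.Lookup(key);
Console.WriteLine(fetched["test1"].StringValue);
```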
There are a few things to note here. The first is that you will need to select the right Namespace and Kind: the Namespace drop-down defaults to [default], so you won't see your new records until you select the namespace you used (TestNamespace in this case).
When things go wrong
The above instructions are deceptively simple; getting this example working was by no means straightforward. Fortunately, when you have a problem with GCP and ask on Stack Overflow, you get answered by Jon Skeet. The following is a summary of an error that I encountered.
System.InvalidOperationException: ‘The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.’
The error occurred on the BeginTransaction line.
The ConsoleLogger above isn't just there for show; it gives some additional information, in this case:
D1120 17:59:00.519509 Grpc.Core.Internal.UnmanagedLibrary Attempting to load native library "C:\Users\pmichaels\.nuget\packages\grpc.core\1.4.0\lib\netstandard1.5\..\..\runtimes\win\native\grpc_csharp_ext.x64.dll"
D1120 17:59:00.600298 Grpc.Core.Internal.NativeExtension gRPC native library loaded successfully.
E1120 17:59:02.176461 0 C:\jenkins\workspace\gRPC_build_artifacts\platform\windows\workspace_csharp_ext_windows_x64\src\core\lib\security\credentials\plugin\plugin_credentials.c:74: Getting metadata from plugin failed with error: Exception occured in metadata credentials plugin.
It turns out that the code was failing somewhere in here; finally, with much help, I managed to track the error down to a firewall restriction.