We need better interoperability between dynamic and statically compiled C#

It’s 2013 now, the .NET DLR has been out for more than three years, and there are plenty of great libraries around with a clever mix of statically compiled and dynamically evaluated code. I am guilty of one myself.

Unfortunately, the ‘dynamic’ keyword doesn’t bring only joy to the art of programming. Its power often results in unintended behavior that can usually be explained by digging deep into the language implementation, but that still looks unreasonable.

Look at the interface below. Its first method returns a reference to the interface itself, and the second method returns a result, so the two methods can be chained together:

public interface IFluentClient
{
    IFluentClient SaveText(string text);
    string GetText();
}

Now in order to be able to use this interface we need a class that implements it. Here it is:

class FluentClient : IFluentClient
{
    private string _text;

    public IFluentClient SaveText(string text)
    {
        _text = text;
        return this;
    }

    public string GetText()
    {
        return _text;
    }
}

This class is not public and we can’t instantiate it from a different assembly, so we will create a factory for it:

public class ClientFactory
{
    public IFluentClient GetFluentClient()
    {
        return new FluentClient();
    }
}

Finally we can write a test for the code above:

[TestMethod]
public void SaveAndGetText()
{
    var client = new ClientFactory().GetFluentClient();
    var text = client.SaveText("a").GetText();
    Assert.AreEqual("a", text);
}

So far so good. The test passes, of course. Now let’s pass a dynamic object to SaveText:

[TestMethod]
public void SaveAndGetText()
{
    dynamic str = "a";
    var client = new ClientFactory().GetFluentClient();
    var text = client.SaveText(str).GetText();
    Assert.AreEqual("a", text);
}

Let’s run the test: it fails with the following error:

Microsoft.CSharp.RuntimeBinder.RuntimeBinderException: 'object' does not contain a definition for 'GetText'

Out of curiosity I tested this code on Mono, and it failed too, with an even stranger error:

FluentClient.GetText() is inaccessible due to its protection level

Obviously a string cast to a dynamic object confused the runtime. But why? The reason is that as soon as any argument of a call is dynamic, the whole invocation is dispatched dynamically and its result is statically typed as dynamic; the runtime binder then resolves the chained GetText call against the runtime type FluentClient, and since that type is inaccessible outside its assembly, the binder falls back to its nearest accessible base type, object – hence the error message. Eric Lippert in a blog post explained some of the reasons why the DLR-to-CLR interoperability was implemented in a way that makes the runtime treat statically compiled code as dynamic after a dynamic object enters the scene. He defends the decision by stating that “if you and I and the compiler know that overload resolution is going to choose a particular method then why are we making a dynamic call in the first place?” This argument was probably fair enough some time ago, when developers were still new to the DLR and used dynamic C# only on special occasions. I’d say it is no longer a good argument. It’s easy and convenient to mix statically compiled and dynamic C#, and the language should make their interoperability more natural.

How can we make the failing test work? For example, by breaking the fluent call chain:

[TestMethod]
public void SaveAndGetText()
{
    dynamic str = "a";
    var client = new ClientFactory().GetFluentClient();
    client = client.SaveText(str);
    var text = client.GetText();
    Assert.AreEqual("a", text);
}

Suddenly it all works, even with the dynamic object. But we can even make the original code work without changing a single line of the test! The strange error message on Mono that complained about the protection level gave me a hint: what if I change the visibility of FluentClient from internal to public? Bingo! Suddenly all tests passed, both on Windows and on Mono. But who could expect such behavior?
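There is a third option that keeps both the fluent chain and the internal class: casting the dynamic value back to a static type at the call site. A minimal sketch against the same test:

// Casting the argument removes 'dynamic' from the invocation, so the
// compiler binds the whole chain statically against IFluentClient again.
var text = client.SaveText((string)str).GetText();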

Use of dynamic code requires more thorough testing. But as the examples above show, it sometimes also requires rewriting calls or class definitions without changing any functionality, only because the .NET runtime accepts one perfectly valid syntax but not another. This makes the interoperability between the CLR and the DLR fragile. I believe it needs to become more predictable and developer-friendly.

Announcing Simple.OData.Client NuGet package for Mono iOS/Android

The release of NuGet 2.5 opened up the possibility of targeting the Mono iOS/Android platforms. Simple.OData.Client, which I maintain, is a portable class library, so it was a natural step to extend the number of supported frameworks with the Xamarin Mono offerings. But as often happens, I hit a few problems before I could publish the updated package (version 0.15).

One problem I had to deal with was the forwarding of types that are packaged differently in .NET and Mono. Luckily, this problem was addressed earlier by other people, so big thanks to Stuart Lodge (@slodge) and Daniel Plaisted (@dsplaisted) for investigating this topic and describing workarounds. Those interested in more details should read this article at StackOverflow. I am now using the same approach that Stuart implemented in MvvmCross, and here is how the file sets for the Mono platforms look:

<!-- Droid -->
<file src="Simple.OData.Client.Core\bin\Release\Simple.OData.Client.Core.dll" 
      target="lib\MonoAndroid16\Simple.OData.Client.Core.dll" />
<file src="lib\Droid\System.Net.dll" target="lib\MonoAndroid16\System.Net.dll" />

<!-- Touch -->
<file src="Simple.OData.Client.Core\bin\Release\Simple.OData.Client.Core.dll" 
      target="lib\MonoTouch40\Simple.OData.Client.Core.dll" />
<file src="lib\Touch\System.Net.dll" target="lib\MonoTouch40\System.Net.dll" />

The next challenge came from a couple of places in my code where I used the async/await pattern. Since Simple.OData.Client targets a wide range of platforms including .NET 4.x and Silverlight, I had installed the Microsoft.Bcl.Async NuGet package that enables the async/await keywords on legacy platforms. However, for the time being the files from this package cannot be deployed on iOS and Android devices (for legal reasons), and Xamarin’s async/await support is a work in progress. After struggling for a few days with various async issues (only compile-time issues: I haven’t had a single runtime problem with asynchronous operations), I reviewed the code and realized that await was used in so few places that I could rewrite them on top of the TPL API almost in no time. So now Simple.OData.Client does not bear any dependency on Microsoft.Bcl.Async.
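To give an idea of the kind of rewrite involved, here is a sketch with a hypothetical helper method, not the library’s actual code: an await on a web response can be expressed with Task.Factory.FromAsync and a continuation instead, removing the dependency on async/await support:

using System.IO;
using System.Net;
using System.Threading.Tasks;

// Instead of: var response = await request.GetResponseAsync(); ...
// wrap the Begin/End pair in a task and attach a continuation.
public Task<string> GetResponseTextAsync(WebRequest request)
{
    return Task.Factory
        .FromAsync<WebResponse>(request.BeginGetResponse, request.EndGetResponse, null)
        .ContinueWith(t => new StreamReader(t.Result.GetResponseStream()).ReadToEnd());
}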

Finally, I had to write new tests to verify functionality on the new platforms. I used an excellent little helper library, MonoDroidUnitTest, for Android tests, and for iOS there was a Unit Test project template in the Xamarin tools. Since I don’t have a Mac, I signed in to MacInCloud to test using the iOS simulator. It kind of works, but the performance is sluggish, so I don’t think it is a good alternative for serious iOS development (though OK for occasional validation). It was easier with Android: I could simply start an Android emulator, load an app with a few tests, and in a few seconds (well, actually minutes) see the following picture:

[Screenshot: Android emulator running the test app]

So in case you are doing OData development and need a library that you can use on multiple platforms, have a look at Simple.OData.Client. Now with Mono support.

Cross-platform design-time view models using portable class libraries

Stuart Lodge is working on a fantastic series of videos and blog posts showing how to build cross-platform mobile applications with MvvmCross. One of his tips is about exposing design-time data. His method is straightforward and efficient, but in case you don’t want to copy sample files to a location inside the Windows Program Files folder, you may consider other alternatives, such as storing design-time information as project content files or embedding it into an assembly.

This task becomes more complicated if you want to show images at design time, because the BitmapImage class is not part of portable class libraries. I used to solve this by declaring view model image properties with the opaque object type and setting their values in the platform-specific parts of the view models. The unfortunate consequence of this approach was having to declare a design-time view model per platform. The platform view models inherited from a common portable view model and only added a tiny bit of non-portable logic, like reading image resources. Here is how I used to show design-time data in my ODataPad application:

public class ServiceViewModel 
{ 
    public string Name { get; set; } 
    public string Description { get; set; } 
    public string Url { get; set; } 
    public object Image { get; set; } 
} 

public class WinRTDesignHomeViewModel : DesignHomeViewModel 
{ 
    public WinRTDesignHomeViewModel() 
    { 
        foreach (var service in this.Services) 
        { 
            service.Image = new BitmapImage(
                new Uri("ms-appx:///Samples/" + service.Name + ".png")); 
        } 
    } 
} 

public class Net45DesignHomeViewModel : DesignHomeViewModel 
{ 
    public Net45DesignHomeViewModel() 
    { 
        foreach (var service in this.Services) 
        { 
            service.Image = new BitmapImage(
                new Uri(@"pack://application:,,,/ODataPad.UI.Net45;component/Samples/" + service.Name + ".png")); 
        } 
    } 
}

The XAML files for the Windows Store and WPF applications included an Image element bound to the Image property of ServiceViewModel:

<Image Source="{Binding Image}" Stretch="UniformToFill" />

Here are the screenshots of design-time views:

[Screenshots: ODataPad design-time views on WinRT, .NET 4.5 (WPF) and Windows Phone 8]

This method works, but as I already pointed out, it feels heavy because each design-time data set is backed by its own design-time view model – due to the non-portability of .NET image types and image management methods.

But there is another way – storing image properties using portable data types, and what can be more portable than a standard string? Enter base64 strings.
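Producing those strings is a one-off task; a throwaway snippet like the one below (the file names are just examples) generates the values to paste into the sample data file:

using System;
using System.IO;

// Encode a sample image as a base64 string for the design-time data file
var imageBase64 = Convert.ToBase64String(File.ReadAllBytes(@"Samples\ODataOrg.png"));
File.WriteAllText(@"Samples\ODataOrg.base64.txt", imageBase64);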

Here’s a revised ServiceViewModel class with image data stored as strings:

public class ServiceViewModel 
{ 
    public string Name { get; set; } 
    public string Description { get; set; } 
    public string Url { get; set; } 
    public string ImageBase64 { get; set; } 
}

We no longer need the WinRTDesignHomeViewModel and Net45DesignHomeViewModel classes. Instead we will store all design-time information in a portable class library as an embedded resource. Note that the Assembly.GetManifestResourceStream method is part of most portable class library profiles, so we can share the following PCL code between all platforms:

public DesignHomeViewModel()
{
    var namespaceName = typeof(DesignHomeViewModel).Namespace;

    // The sample data travels as an embedded resource inside the PCL itself
    var stream = typeof(DesignHomeViewModel).Assembly
        .GetManifestResourceStream(string.Join(".", namespaceName, "SampleServices.xml"));
    using (var reader = new StreamReader(stream))
    {
        this.Services = SamplesService.ParseSamplesXml(reader.ReadToEnd());
    }
}

Wait, but that can’t be sufficient: our XAML views contain Image elements, and we can’t just throw base64 strings at them. Well, almost: it’s all about converters.

Here’s revised XAML code:

<Image Source="{Binding ImageBase64, Converter={StaticResource Base64ToImage}}" Stretch="UniformToFill"/>

And these are the converters for the Windows Store, WPF and Windows Phone applications:

namespace ODataPad.UI.WinRT.Common
{
    public sealed class Base64ImageConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, string language)
        {
            return ConvertAsync(value, targetType, parameter, language).Result;
        }

        public object ConvertBack(object value, Type targetType, object parameter, string language)
        {
            return null;
        }

        public async Task<object> ConvertAsync(object value, Type targetType, object parameter, string language)
        {
            var image = new BitmapImage();

            if (value != null)
            {
                var bytes = System.Convert.FromBase64String((string)value);

                var ras = new InMemoryRandomAccessStream();
                using (var writer = new DataWriter(ras.GetOutputStreamAt(0)))
                {
                    writer.WriteBytes(bytes);
                    await writer.StoreAsync();
                }

                image.SetSource(ras);
            }
            return image;
        }
    }
}

namespace ODataPad.UI.Net45.Common
{
    public class Base64ImageConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
        {
            var image = new BitmapImage();
            if (value != null)
            {
                var bytes = System.Convert.FromBase64String((string)value);

                image.BeginInit();
                image.StreamSource = new MemoryStream(bytes);
                image.EndInit();
            }
            return image;
        }

        public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
        {
            return null;
        }
    }
}

namespace ODataPad.UI.WP8.Common
{
    public class Base64ImageConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
        {
            var image = new BitmapImage();
            if (value != null)
            {
                var bytes = System.Convert.FromBase64String((string)value);
                image.SetSource(new MemoryStream(bytes));
            }
            return image;
        }

        public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
        {
            return null;
        }
    }
}

Now all my design-time data are stored in a single portable class library and I display them using a single view model also declared in a PCL. All platform-specific data such as images are stored as strings and only converted to bitmaps when rendering views using value converters.

Introducing Simple.OData.Client: a portable OData client library for .NET 4.x, Windows Store, Silverlight 5 and Windows Phone 8

Last week I was busy with my home project: creating an OData portable class library. This project originated from the Simple.Data OData adapter, when I needed such a library for a Windows Store application and didn’t find any suitable one. And since Simple.Data only supports .NET 4.x platforms, I started extracting parts of the adapter to make a PCL.

As always happens, I spent far longer on this task than planned, but it was worth it even if the only outcome had been code cleanup. I thought the quality of the Simple.Data OData adapter code was pretty good, but when I was forced to separate its non-portable portions, I realized that this was a genuine exercise in separation of concerns. When you are writing code that targets multiple platforms, it’s no longer just your subjective decision how much responsibility you can delegate to a certain class or module. If you occasionally brought in stuff that does not meet portability criteria – logging, creating user credentials, external data access – you will have to remove it. Otherwise your code won’t compile.

Of course, running the code through a portability check won’t expose all violations of clean code principles; moreover, there may be plenty of legitimate reasons for code not to be portable. But in many cases it’s a good check, and we should probably ask ourselves the question “why is this code not portable?” more often.

But enough of lessons and principles. The portable OData client library is here, and Mark Rendle was kind enough to let me keep the “Simple” prefix in the library name, although this word is kind of his trademark now (his remark was “as far as I’m concerned it all grows the brand :)”). And I hope that the simplicity of the Simple.OData.Client API is on the level of the other members of the family of Simple frameworks.

Simple.OData.Client is an open source library, available at GitHub. In addition, it has a NuGet package. If you install it from NuGet, depending on your project target framework you will get a reference to one of two assemblies:

  • Simple.OData.Client.Net40.dll: for target frameworks .NET 4.0, .NET 4.0.3, .NET 4.5;
  • Simple.OData.Client.Core.dll: for portable libraries targeting .NET 4.0.3, .NET 4.5, Windows Store, Silverlight 5, Windows Phone 8 and for target frameworks Windows Store, Silverlight 5 and Windows Phone 8.

Support for Android and iOS (via Xamarin Mono) is in my plans (in fact, AFAIK the portable version may already be used to target Mono for Android).

The project has Wiki pages with many examples. It has two API flavors: basic and fluent. Below is an example of using the fluent API to retrieve data from the NuGet OData feed:

var client = new ODataClient("http://packages.nuget.org/v1/FeedService.svc/");
var x = ODataFilter.Expression;
var packages = client
    .For("Packages)
    .Filter(x.Title == "Simple.OData.Client")
    .FindEntries();

foreach (var package in packages)
{
    Console.WriteLine(package["Title"]);
}

Simple.OData.Client supports all HTTP methods used by OData protocol (GET/POST/PUT/MERGE/DELETE) including support for batch requests.
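The modification methods follow the same style. As an illustration, here is a sketch of inserting and updating an entry with the basic API; I am assuming here the InsertEntry/UpdateEntry methods that take a collection name and property dictionaries, so check the Wiki for the exact signatures:

using System.Collections.Generic;

var client = new ODataClient("http://localhost/MyService.svc/"); // hypothetical service URL

// Insert a new entry and request the stored version back
var product = client.InsertEntry("Products",
    new Dictionary<string, object> { { "Name", "Test1" }, { "Price", 18m } }, true);

// Update the entry identified by its key
client.UpdateEntry("Products",
    new Dictionary<string, object> { { "ID", product["ID"] } },
    new Dictionary<string, object> { { "Price", 19m } });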

Converting a Git repository to TFS

Recently we had the task of converting a Git repo to TFS. The original source code repository was managed under TFS; then, during a trial period, we ran it using Git (exported from TFS). However, due to organizational policies we had to switch back to TFS. At the same time our project was assigned a new name in TFS, so we had to create a new TFS source repository and preserve the Git commit history. Below is a description of how we managed to achieve this, using the Git-Tf open-source tool released by Microsoft.

1. Cloning an existing Git repo

A new copy of a Git repo was created to be used as an import source. This was done using a trivial Git command:

git clone //gitserver/Source/OurProject.git OurProject.Git.Tfs

So OurProject.Git.Tfs is the name of the new repository that will be converted to TFS.

2. Configuring Git-Tf to establish a link between Git and TFS repositories

The following command instructs Git-Tf to establish a link between the Git and TFS repos:

git-tf configure http://tfsserver:8080/tfs $/OurProject/Master

3. Importing Git commit history… if it works

The following command should convert to TFS all your Git commits, assuming they have a linear form:

git-tf checkin --deep

Most likely this won’t work for any Git repo with non-rebased branches. Then you have a choice: either import all Git commits as a single changeset (losing the history) or put in some more effort to transform the history into a shape suitable for TFS export.
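For the record, the single-changeset route appears to be just the same command without the deep flag, since shallow checkin is Git-Tf’s default mode:

git-tf checkin

But first…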

4. Ensuring Git commit messages are not empty

If you try to rebase a Git repo where some commit messages are empty (a bad practice!), you are going to fail. But you can fix the messages by running a script similar to the one I found on StackOverflow:

git filter-branch -f --msg-filter '
read msg
if [ -n "$msg" ] ; then
    echo "$msg"
else
    echo "The commit message was empty"
fi'

If your repository is large, this will take some time, but after that you can move on to the rebase step.

5. Rebasing a Git repo to make a linear commit history

Git and TFS have significant differences in their architecture, so in order for a Git commit history to be swallowed by a TFS branch, it has to have a linear form. This can be achieved by finding the last commit prior to the first branch (i.e. find the very first (oldest) branch and take the commit prior to it) and using its hash as a parameter to an interactive rebase command:

git rebase -i 5fa13f88cf61b37fea760d24e78819383a8df8ca

This method was also suggested at StackOverflow.
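By the way, if the history is too long to eyeball, the fork point can be located with standard Git commands:

# visualize where the history first diverges
git log --oneline --graph --all

# or list merge commits; the last line printed is the oldest one,
# i.e. the first place where the history stops being linear
git rev-list --min-parents=2 HEAD | tail -1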

The Git interactive rebase tool will pop up with a long list of actions. Don’t change anything there, just save it, and the rebase commences. It may stop quite a few times complaining about merge conflicts. Resolving the conflicts is on you, but usually they are trivial and can be resolved just by invoking “git add .” or “git add -u” followed by “git rebase --continue”. In the end you will have a rebased Git repo that is ready to be exported to TFS.

6. Importing Git commit history… second attempt

So now you type the checkin command again:

git-tf checkin --deep

Worked? Then you’re lucky, because in our case it didn’t, due to a bug in the current version of the Git-Tf tool. If your commits include renames that move files between directories, the tool gets confused and stops with a message like this:

Git-tf: failed to pend changes to TFS due to the following errors. Please fix the errors and retry check in.
The item $/OurProject/Master2/Backend/Client/Service References/Entities already exists.

We contacted the Microsoft team; they admitted it is a bug in the Git-Tf tool but suggested a workaround:

git tf checkin --deep --renamemode=none

This workaround causes renames to be handled as deletions followed by insertions – not quite right, but acceptable in our case, because we only needed to preserve historical code snapshots, not how they were reached.

After the last step, a TFS repository with full commit history should be established, and developers may either use it directly or keep using Git locally, syncing with TFS via Git-Tf.

Check your code portability with PCL Compliance Analyzer

I am extracting parts of my Simple.Data OData adapter to make a portable class library (PCL). The goal is to create an OData library available for desktop .NET platforms, Windows Store, Silverlight, Windows Phone and even Android/iOS (using Xamarin Mono). To study platform-specific PCL capabilities I used an Excel worksheet provided by Daniel Plaisted (@dsplaisted). It’s a very helpful document, but it would be much easier if I could simply point some tool at an assembly file and have it show the portable and non-portable calls.

I haven’t found such a tool, so I wrote one. It’s called PCL Compliance Analyzer. Select an assembly file and the set of platforms you want to target, and it will show you whether the assembly is PCL compliant and which calls are not. Here are the results for Mono.Cecil (which I used to scan assembly calls):

[Screenshot: PCL Compliance Analyzer results for Mono.Cecil]

As you can see, only about 10% of Mono.Cecil is not portable to any of the .NET platforms.
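Under the hood such a check boils down to walking every method body with Mono.Cecil and collecting the external members it references. Here is a minimal sketch of the idea (not the analyzer’s actual code):

using System;
using System.Linq;
using Mono.Cecil;

// List every method call in an assembly together with the assembly it
// resolves to; comparing that scope against the chosen PCL profile
// (or against a third-party list) yields the compliance report.
static void ScanAssembly(string path)
{
    var assembly = AssemblyDefinition.ReadAssembly(path);
    foreach (var type in assembly.MainModule.Types) // top-level types only, for brevity
    foreach (var method in type.Methods.Where(m => m.HasBody))
    foreach (var instruction in method.Body.Instructions)
    {
        var callee = instruction.Operand as MethodReference;
        if (callee != null)
            Console.WriteLine("{0} -> {1} [{2}]",
                method.FullName, callee.FullName, callee.DeclaringType.Scope.Name);
    }
}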

Let’s try something completely different. Here’s RavenDB.

[Screenshot: PCL Compliance Analyzer results for RavenDB, all platforms]

Note that the non-portable calls include calls to third-party libraries, like NLog. This is not interesting, because those third-party assemblies are subject to a separate check, and it’s reasonable to limit the portability compliance check to system APIs only. So we tick the “Exclude third party libraries” check box and repeat the scan. Here’s the new result:

[Screenshot: PCL Compliance Analyzer results for RavenDB, third-party libraries excluded]

Exclusion of third-party libs reduces the number of non-portable calls from 1642 to 674. But we are targeting all platforms (even Xbox 360), and this is perhaps not a very good idea for a NoSQL database with LINQ support. If we limit the supported platforms to only .NET 4.5, we end up with this picture:

[Screenshot: PCL Compliance Analyzer results for RavenDB, .NET 4.5 only]

Of course the choice of one of the RavenDB files is completely random here. I just had an old project folder with files left over from Rob Ashton’s workshop on RavenDB, so I used one of the assembly files to demonstrate what you can do with PCL Compliance Analyzer.

Finally, if you point it at a true PCL assembly and select the right set of target platforms, PCL Compliance Analyzer will confirm that all calls are portable, like it did in the case of my Simple.OData.Client:

[Screenshot: PCL Compliance Analyzer results for Simple.OData.Client, all calls portable]

PCL Compliance Analyzer is an open-source project hosted at GitHub, and if you only need the binaries, you can download them here.

Managing type inheritance with Simple.Data OData adapter

Type inheritance is not frequently implemented in OData services. I’d say that REST services are not a very good fit for the OO paradigm, and there are better ways to expose information in a REST service than opening up the whole object hierarchy. However, today the easiest way to build an OData service is to write a tiny wrapper around an Entity Framework model using WCF Data Services, so if OData didn’t support inheritance, it would leave out a wide range of real-world data models. Therefore OData and WCF Data Services have full support for type inheritance, and so does the Simple.Data OData adapter. Inheritance support was initially added in version 0.11 at the request of Gianni Mellon (thanks for the valuable feedback), and version 0.12 fixed a few bugs related to updating and deleting inherited type instances.

Consider the following type hierarchy:

[Diagram: Transport type hierarchy with derived types Ships and Trucks]

“Transport” is an abstract type, and we want to be able to retrieve and modify both derived types: “Ships” and “Trucks”. Here’s how you can do it with Simple.Data and its OData adapter.

1. Retrieving all transport instances (both ships and trucks)

IEnumerable<dynamic> transports = _db.Transport.All();

Then you can traverse the resulting collection and access either the ShipName or the TruckNumber property, depending on the concrete type of each element.

2. Retrieving only derived type (ships) instances

IEnumerable<dynamic> ships = _db.Ships.All();

Simple, isn’t it? It’s Simple.Data. Now let’s look up just one ship:

var ship = _db.Ships.Find(_db.Ships.ShipName == "Titanic");

3. Creating new entries

var ship = _db.Ships.Insert(ShipName: "Test1");

4. Updating entries

_db.Ships.UpdateByTransportID(TransportID: ship.TransportID, ShipName: "Test2");

or

var ship = _db.Ships.Find(_db.Ships.ShipName == "Titanic");
ship.ShipName = "Not Titanic";
_db.Ships.Update(ship);

5. Deleting entries

_db.Transport.DeleteByTransportID(ship.TransportID);

or

var ship = _db.Ships.Find(_db.Ships.ShipName == "Titanic");
_db.Ships.Delete(ship);

The latest Simple.Data.OData package is available from NuGet.

Simple.Data OData adapter: approaching version 1.0

It’s January the first, but when I looked at my Twitter feed I realized I am lazy. People are discussing programming bugs, API design flaws and the book chapters they are writing. I need to pretend I’m one of them and do something useful. And it’s a good opportunity for me to say a few words about the current status of the Simple.Data OData adapter, an open source library that I’ve been developing in my free time for the whole of last year. So here we go.

Is Simple.Data OData adapter finished?

It’s not a good idea to use the word “finished” when talking about software, but if you mean “production ready”, then yes. Although its current NuGet package version is 0.8.4, it will be updated to 1.0 as soon as version 1.0 of its master project, Simple.Data, is released. The current OData adapter is a release candidate, and I am not aware of any issues with it.

What about documentation then?

I was waiting for this question, and I am well prepared! You will find a set of Wiki pages at GitHub describing every single scenario of using the adapter. I am copying the table of contents here:

Getting started with Simple.Data OData provider

Retrieving data

Retrieving all data from a collection
Retrieving all data matching search criteria
Retrieving all data matching search criteria specified in a method name
Retrieving a single row matching search criteria
Retrieving a single row matching search criteria specified in a method name
Retrieving a single row by key
Expanding results with linked entries
Retrieving linked entries without fetching its owners
Including standard functions in search criteria
Results projection, paging and ordering

Modifying data

Adding entries
Adding entries with links
Updating entries
Updating entries with links
Deleting entries
Linking and unlinking entries
Batch updates

What’s next?

Recently I have split the OData adapter project in two parts: one (Simple.OData.Client) is the actual OData client with both C# dynamic and more traditional fluent APIs, and the second (Simple.Data.OData) is a thin wrapper around the first one that exposes the Simple.Data client API. The motivation for this change was work on a Windows Store application with OData client support that couldn’t use Simple.Data, because it does not run on the WinRT platform. To make this happen I began working on Simple.OData.Client, aiming to release it as a portable class library (PCL) that can run on various platforms including Windows Phone. At present there’s only a .NET 4.0 Simple.OData.Client NuGet package, but WinRT, .NET 4.5 and Windows Phone versions will follow soon.

Using OData protocol V3 with MongOData OData provider

In a recent post I showed how MongOData (a MongoDB OData provider that I wrote and maintain) exposes BsonArray MongoDB properties. Support for arrays of primitive and complex types was added to the OData protocol in its latest version 3, but many clients and code generation tools haven’t been enhanced with this feature, so they will fail when accessing a MongoDB collection where some items are exposed as arrays.

Moreover, due to the nature of document databases, arrays are a natural way of exposing collections of items, versus the one-to-many relationships used in relational databases. So the chances are pretty high that an arbitrary MongoDB repository will contain documents that break old-generation client tools.

Let me show you this with a simple example. I’ve been working on a set of MongOData samples that illustrate how to set up and use the provider, and when I tried to generate a WCF Data Services client proxy using Visual Studio 2010, it showed me the following error message:

[Screenshot: DataServiceVersion error shown by Visual Studio during proxy generation]

The offending part was the Supplier class definition, which contains an array of Addresses:

public class Supplier
{
    public string Name { get; set; }
    public Address[] Addresses { get; set; }
}

So what can you do to overcome this limitation? It’s really silly not to be able to retrieve a data structure that the service can expose and your code can consume, when all that stops you is a man in the middle: the proxy generation tool.

My answer to this is simple: bypass proxy generation. The OData protocol follows REST principles, and proxy generation is so SOAP. Just use C# dynamic, so the service metadata is evaluated at runtime.

One simple way to achieve this is to use the Simple.Data OData adapter. Here’s the code that retrieves the content of a MongoDB collection (containing arrays) using Simple.Data:

Console.WriteLine("Connecting to MongoDB sample OData service...");
dynamic context = Database.Opener.Open("http://localhost:50336/MongoDataService.svc/");

Console.WriteLine("Retrieving categories...");
Console.WriteLine();
foreach (var category in context.Categories.All().ToList())
{
    Console.WriteLine("Category: ID=[{0}], Name=[{1}]",
        category.ID,
        category.Name);
}
Console.WriteLine();

Console.WriteLine("Retrieving products...");
Console.WriteLine();
foreach (var product in context.Products.All().ToList())
{
    Console.WriteLine("Product: ID=[{0}], Name=[{1}], Category=[{2}], Quantity=[{3} {4}], " +
                      "ReleaseDate=[{5}], DiscontinueDate=[{6}], Suppier=[{7}]",
        product.ID,
        product.Name,
        product.Category.Name,
        product.Quantity.Value,
        product.Quantity.Units,
        product.ReleaseDate,
        product.DiscontinueDate,
        product.Supplier == null ? null : product.Supplier.Name);
}

I created MongOData.Samples solution with the following projects:

  • CreateSampleData – creates a sample MongoDB database (the default connection string is mongodb://localhost/mongodatasamples, you can of course change it); during creation it asks if you want to enable OData protocol V3 support;
  • SampleService – a WCF Data Services provider that uses MongOData to connect to a MongoDB instance;
  • SampleWcfDataServiceClient – an OData client that uses proxy classes generated by Visual Studio, so it cannot consume services that use OData V3 features such as arrays;
  • SampleDynamicClient – a proxy-less client that uses the Simple.Data OData adapter and supports OData V3 features.

You can download the sample solution here.
