
MVP Weblogs

by: Jan Karel Pieterse
RefTreeAnalyser 2.0 has just been updated. I have added support for hotkeys using the Alt key.
comments 5/6/2015 7:30:00 PM

by: Jan Karel Pieterse
Today I posted an entirely new article. It describes all the steps needed to create a basic Excel add-in.
comments 5/4/2015 4:10:00 PM

by: Jan Karel Pieterse
Today I posted a link to one of the most comprehensive sets of Excel productivity tools I have ever seen: Andrew's Excel Utilities
comments 4/7/2015 10:30:00 AM

by: Jan Karel Pieterse
Our event is coming very soon now and we're very much looking forward to it. We have outstanding speakers and excellent content, so everything is lined up to make this a superb Excel event. To entice the undecided Excel lovers to make up their minds and subscribe after all, we decided to make it even more attractive to attend. As of March 27th, 2015 we offer a € 200 last-minute discount per attendee for both days and € 100 for one day! Register now at http://topexcelclass.com/index.php/amsterdam-excel-summit/registration/ and meet us on April 13th and 14th in Amsterdam.
comments 3/27/2015 9:20:00 AM


We’ve released LLBLGen Pro v5.0 CTP 1! It’s a Community Technical Preview (CTP) of the upcoming v5.0, which has been in development since fall 2014. The CTP is open to all v4.x customers (it’s in the customer area, in the v4.2 betas section) and comes with a time-limited license which expires on June 1st, 2015. As this isn’t a full beta, (many) more features will be added before the beta hits.

Below I’d like to show some of the new features in action. Click the screenshots for a bigger version.

New, skinnable UI

A new UI was long overdue. The current (v4.2) version still uses VS.NET 2008-style toolbars/controls and it looks just… dated. So we ported the complete UI to DevExpress controls (still WinForms though), as some of our controls were already based on those. The screenshots below show the default VS.NET 2013 white theme.

New Relational Model Data Sync System

In short: the Sync system. Sync replaces both database-first related actions like refresh catalog and model-first related actions like auto-map and adjust relational model data. It allows a sync source to be set per schema, which controls where table-related relational model data is obtained from: the database or the entity model. Stored procedures/views/TVFs are always obtained from the database. Everything is managed from a single tab, the Sync Relational Model Data tab, which is opened by clicking the sync button on the toolbar or the menu item in the project menu.


A big benefit of the new system is that it functions even when the project contains errors: it's no longer necessary to correct project elements before a refresh. It also doesn't adjust relational model data in database-synced schemas, so it's no longer required to export DDL SQL before code generation just because validation adjusted some fields after a change.

Revamped Home tab

The Home tab now shows active tiles the user can click to navigate through one of the scenarios (database first / model first and new / existing project). Which tiles are shown depends on what the user did last and the state of the designer.


It's an easy way to get started with the designer and replaces the static, webpage-based home tab.

It acts like a wizard in disguise: you can do the basic tasks right from the home tab, by clicking tiles. Here I’ve opened a Database first project and the state of the designer now shows different tiles with different actions.


Search in Project Explorer / Catalog Explorer

Directly available in the Project Explorer and Catalog Explorer are the search boxes: type in any string and the nodes matching the string are kept; all other nodes are filtered out. This allows finding elements with ease. Clearing the search restores the tree to how it was before the search. In the screenshot below I’ve searched for ‘Sales’ in both the project explorer and the catalog explorer. It shows all nodes matching the string specified, including parent nodes (to give context), and hides other nodes. The more advanced, LINQ-query based search is still available in the designer, but this dedicated search on the two explorer panes is easier to use and requires no extra forms to navigate.
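The filtering idea described above (keep matching nodes plus their ancestors for context, hide the rest) can be sketched with a small recursive prune. This is a conceptual illustration in Python, not LLBLGen Pro code; the tree shape and names are invented:

```python
def filter_tree(node, needle):
    """Return a pruned copy of `node`, or None if neither the node nor any
    descendant matches `needle`. A node is a dict: {"name": str, "children": [...]}."""
    kept_children = [c for c in (filter_tree(child, needle)
                                 for child in node.get("children", [])) if c]
    # Keep this node if it matches, or if any descendant matched (parent context).
    if needle.lower() in node["name"].lower() or kept_children:
        return {"name": node["name"], "children": kept_children}
    return None

tree = {"name": "Catalog", "children": [
    {"name": "Sales", "children": [{"name": "SalesOrder", "children": []}]},
    {"name": "HR", "children": [{"name": "Employee", "children": []}]},
]}

pruned = filter_tree(tree, "sales")
# The "HR" branch is filtered out; "Catalog" survives only as context for "Sales".
```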


Real Time Validation

The designer gained a real-time system to schedule and run tasks at will through several dedicated dispatch queues. This greatly helps offload work from the UI thread without having to mess with multi-threading, as it utilizes the .NET 4.5 Task Parallel Library. Configuring work is as easy as defining an event, a handler, and which dispatch queue should run the handler call; the system takes care of the rest, including overflow protection (only a limited number of calls are allowed per interval).

The real-time validation in action:


So these are the highlights of this CTP (there are many tiny improvements under the hood too). We have much more in store for v5.0, so stay tuned!

comments 3/17/2015 4:20:11 PM

by: Jan Karel Pieterse
Extended early bird discount on our Amsterdam Excel Summit! Last week our mail server experienced some problems, which meant that the registration form did not work. To compensate for any inconvenience caused by this problem, we have decided to extend the early bird discount period by an additional week. So register before March 1st 2015 and receive the 50 Euro discount!
comments 2/23/2015 3:40:00 PM

by: Jan Karel Pieterse
In my Circular Reference article I promised to include an example of a VBA driven circular reference calculation. I have done that today so head over to my site to check it out!
comments 2/11/2015 10:30:00 AM

by: Jan Karel Pieterse
Join us in Amsterdam on April 13th and 14th 2015, for the second Amsterdam Excel Summit! We have a lot of interesting subjects, including: Business Intelligence in Excel vNext, An introduction to Power Query, Making working with Excel charts less painful, Professionalising your Power Pivot Model, More efficient VBA UDFs, ...
comments 2/10/2015 6:20:00 PM

by: Jan Karel Pieterse
Name Manager for Excel has been updated. Improved renaming of range names in VBA code.
comments 2/10/2015 6:20:00 PM


I caved. For years I’ve denied requests from customers to publish the LLBLGen Pro runtime framework assemblies on NuGet, for the reason that if we had to introduce an emergency fix in the runtimes which also required template changes, people with dependencies on the NuGet packages would have a problem. While this might be true in theory, in practice it’s so uncommon that it more and more turned into an excuse.

Once customers started publishing the runtimes themselves on NuGet, it was time to bite the bullet and publish them ourselves, officially. So we did. At the same time we published the interceptor assemblies of ORM Profiler on NuGet.

The URLs

For LLBLGen Pro:


For ORM Profiler:


Who are these assemblies for?

The assemblies are for customers who want to stay up to date with the latest runtimes. Every time we publish a new build, the runtimes and interceptor dlls are automatically updated with the latest build. We never introduce breaking changes in released assemblies, so they’re safe to use in code and update to the latest version.

How are they versioned?

The LLBLGen Pro Runtime Framework assemblies are versioned as: 4.2.yyyymmdd, where yyyymmdd is the build date. The ORM Profiler interceptors are versioned as: 1.5.yyyymmdd.
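The scheme above (major.minor.build-date) is simple enough to express in a few lines. A minimal sketch, assuming the date component is just the build date formatted as yyyymmdd:

```python
from datetime import date

def build_version(major: int, minor: int, build_date: date) -> str:
    """Compose a version string in the major.minor.yyyymmdd scheme."""
    return f"{major}.{minor}.{build_date:%Y%m%d}"

v = build_version(4, 2, date(2015, 2, 10))  # → "4.2.20150210"
```

One nice property of this scheme is that the build-date component sorts chronologically, so "latest build" and "highest version" always coincide.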

What’s in the packages?

The DQE packages come with a single DLL, the DQE dll, and have a dependency on the ORMSupportClasses package. The ORMSupportClasses package contains both the .NET 3.5 build and the .NET 4.5 build with async support: if your project targets .NET 4.5, you will automatically reference the .NET 4.5 build with async support.

The Interceptor packages contain the interceptor dll and support dlls which don’t need a separate package of their own. The Entity Framework interceptor has a dependency on Entity Framework 6.

Do the DQE packages depend on ADO.NET provider packages?

No. All DQEs work with the DbProviderFactory system, so the referencing project doesn’t need a reference to an ADO.NET provider. The ADO.NET provider has to be present on the system, but as the provider assembly doesn’t need to be referenced by the VS.NET project, the DQE package doesn’t take a direct dependency on the related ADO.NET provider package: such a dependency would mean the provider dll would be directly referenced after the DQE package has been installed.
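The DbProviderFactory idea, stripped to its essence, is late binding: the data-access layer asks a registry for a provider by its invariant name at runtime and never takes a compile-time reference to any specific provider. A Python analogy of that pattern (not .NET code; the registry contents are invented, with the stdlib `sqlite3` module standing in for a real provider):

```python
import importlib

# invariant name -> module that must be present on the system at runtime
PROVIDER_REGISTRY = {
    "System.Data.SqlClient": "sqlite3",  # stand-in module for this sketch
}

def get_factory(invariant_name: str):
    """Resolve the provider module lazily by name. The calling code carries
    no import of any concrete provider; resolution fails only if the
    provider is missing at runtime, mirroring DbProviderFactories.GetFactory."""
    module_name = PROVIDER_REGISTRY[invariant_name]
    return importlib.import_module(module_name)

factory = get_factory("System.Data.SqlClient")
conn = factory.connect(":memory:")  # provider-specific API, resolved late
row = conn.execute("select 1").fetchone()
```

The design choice being illustrated: because the binding happens by string name, the package describing the data-access layer needs no dependency edge to any provider package.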

Hope this helps the customers who have asked us for this feature for so long!

comments 2/10/2015 5:33:58 PM

by: Jan Karel Pieterse
Join us in Amsterdam on April 13th and 14th 2015, for the second Amsterdam Excel Summit! An absolutely unique group of Excel MVPs will gather in Amsterdam to share their expert knowledge with you. The Excel MVPs happen to be in Amsterdam for a meeting and we've succeeded in getting some of them to present at our event. Make sure you register!
comments 1/22/2015 11:15:00 AM

by: Jan Karel Pieterse
Excel VBA for Financials version 2: a two-day Excel VBA course (May 20 and 26, 2015). Save time by automating your reports! Unravel the secrets of VBA and take your Excel knowledge and skills to unprecedented heights!
comments 12/30/2014 2:05:00 PM


It’s likely you’ve heard about Microsoft’s release of the .NET Core source code, their announcement of ASP.NET vNext and the accompanying PR talk. I’d like to point first to two great articles which analyze these bits without being under the influence of some sort of Kool-Aid: “.NET Core: Hype vs. Reality” by Chris Nahr and “.NET Core The Details - Is It Enough?” by Mike James.

I don’t have a problem with the fact that the ASP.NET team wants to do something about the performance of ASP.NET today and the big pile of APIs they created during the past 12-13 years. However I do have a problem with the following:

“We think of .NET Core as not being specific to either .NET Native nor ASP.NET 5 – the BCL and the runtimes are general purpose and designed to be modular. As such, it forms the foundation for all future .NET verticals.”

The quote above is from Immo Landwerth’s post I linked above. The premise is very simple, yet has far-reaching consequences: .NET Core is the future of .NET. Search for ‘Future’ in the article and you’ll see more references to this remark besides the aforementioned quote. Please pay extra attention to the last sentence: “As such, it forms the foundation for all future .NET verticals”. The article is written by a PM, a person who’s paid to write articles like this, so I can only assume what’s written there has been eyeballed by more than one person and can be assumed to be true.

The simple question that popped up in my mind when I read about ‘.NET core is the future’, is: “if .NET core is the future of all .NET stacks, what is going to happen with .NET full and the APIs in .NET full?”

Simple question, with a set of simple answers:

  • Either .NET Core + new framework libs will get enough body and will be simply called ‘.NET’, and what’s left is sent off to bit heaven, so stuff that’s not ported to .NET Core nor the new framework libs is simply ‘legacy’ and effectively dead.
  • Or .NET Core + new framework libs will form a separate stack besides .NET full and will co-exist like there’s a stack for Store apps, for Phone etc.

Of course there’s also the possibility that .NET Core will follow the fate of Dynamic Data, Webforms, WCF Ria Services and WCF Data Services, to name a few of the many dead and burned frameworks and features originating from the ASP.NET team, but let’s ignore that for a second.

For 3rd party developers like myself who provide class libraries and frameworks to be used in .NET apps, it’s crucial to know which one of the above answers will become reality: if .NET Core + new framework libs is the future, sooner or later all 3rd party library developers have to port their code over, and the rule of thumb is: the sooner you do that, the better. If .NET Core + new framework libs will form a separate stack, it’s an optional choice and therefore might not be a profitable one. After all, the amount of people, time and money we can spend on porting code to ‘yet another platform/framework’ is rather limited if we compare it to a large corporation like Microsoft.

Porting a large framework to .NET Core, how high is the price to pay?

For my company, I develop an entity modeling system and O/R mapper for .NET: LLBLGen Pro. It’s a commercial toolkit that’s been on the market for over 12 years now, and I’ve seen my fair share of frameworks and systems come out of Microsoft which were positioned as essential for the .NET developer at that moment and crucial for the future. .NET Core is the base for ASP.NET vNext and positioned to be the future of .NET, and applications on .NET Core / ASP.NET vNext will likely use some sort of data access against a database. This means that my runtime (the LLBLGen Pro runtime framework, which is our ORM framework) should be present on .NET Core.

Our runtime isn’t small: it spans over 500,000 lines of code and has a lot of functionality, not all of which is considered ‘modern’. But not all of us develop new software: most developers out there actually do maintenance work on software which will likely be used in production for years to come. This means that what’s provided as functionality today will be required tomorrow as well. Add to that the fact that a lot of our users write desktop applications, so our framework has to work on .NET full no matter what. This has the side effect that what’s in our runtime will have to stay there for a long period of time, and porting it to .NET Core will effectively mean: create a fork of it for a new runtime and maintain them in parallel.

I’ve done this before, for the Compact Framework, a limited .NET framework that ran on Windows Mobile and other limited devices, so I know what costs come with a port like this:

  • research into what is not supported, which APIs act differently, what limitations there are and which quirks / bugs to stay away from or take into account
  • features in the .NET framework aren’t there, so you have to work around these or provide your own implementation
  • APIs are different or lack overloads so you have to create conditional compile blocks using #if
  • because not everything is possible on a limited framework you have to cut features in your own framework, limiting usability
  • less features or limited features in your own work mean you have to provide different documentation for these features to explain the differences
  • a different platform requires additional tests to make sure what changed actually works
  • additional maintenance costs for support, as issues only occurring with the additional framework require specific setups for reproducing the issue
  • supporting a new platform isn’t for a week but for a long period of time as customers take a dependency on your work for a long period of time.

For an ISV or for an OSS team these issues have a severe impact: they take time to resolve and time has a cost: you can’t spend the time on something else. In short: it’s a serious investment.

I’m not afraid to make these kinds of investments. In the past I’ve spent time on things like the following (time stated is full-time development work):

  • Several months implementing DataSource controls for our runtime to be used in ASP.NET webforms. Dead: ASP.NET vNext doesn’t contain webforms support anymore. We still ship the DataSource controls though.
  • Several months on adding support for Dynamic Data in our runtime. Dead. We don’t ship support for it anymore. Customers who want it can get the source if they want to from the website.
  • Several months on adding support for WCF Ria Services in our runtime. Dead. We don’t ship support for it anymore. Customers who want it can get the source if they want to from the website.
  • Several months on adding support for WCF Data Services in our runtime. Dead, as the future is in WebAPI, which is now merged into ASP.NET vNext. We still ship the library.
  • Five months on adding support for Compact Framework. Dead. We don’t ship support for it anymore. Last version which did is from 2008.
  • Two months on adding support for XML serialization. Dead. JSON is now what’s to be used instead. We still ship full XML serialization support with multiple formats.
  • One month on adding support for binary serialization. Dead. JSON is now what’s to be used instead. We still ship full binary serialization support with an optimized pipeline for fast and compact binary serialization of entity graphs.
  • Several weeks on adding support for WCF services in our runtime. Dead, as the future is WebAPI, which is now merged into ASP.NET vNext. We still ship support for it.
  • Several months on adding support for complex binding in Winforms and WPF: Still alive, but future is unclear (see below). We ship full support for it, including entity view classes.
  • Almost a full year on adding support for Linq in our runtime. Still alive. This was a horrible year but in the end it was worth it.
  • One month on adding a full async/await API to our runtime. Still alive. This was actually quite fun.

That’s just the runtime, and the changes required to ‘stay accurate’ according to Microsoft’s roadmap for .NET and what’s required to build a ‘modern’ application for .NET. As you can see, lots of time spent on stuff that’s considered ‘dead’ today but was very relevant at that moment, or looked like it would become great soon.

One can also imagine that, with the experience from the past, I’m a bit reluctant to support new stuff nowadays; see it as a case of “fool me 10 times, shame on me”, or something like that. At the same time, things change, and if .NET Core is the future for both server and desktop, we have to abandon the current .NET framework and its features anyway in the future, so moving is inevitable then. So what’s one more investment?

It’s not a simple investment

It’s not as simple as ‘one more investment, what harm can it do?’. The thing is that for a small ISV like us, it’s crucial you spend your time on the things that matter: if things fail, it might be fatal to the company. This is different from a team within Microsoft, which still gets a paycheck after a failed project: they move on to the next project, or even get a chance to rewrite everything from scratch. So from the perspective of a Microsoft employee, it might look like something that might take a month or two and then you’re all set for ‘the future’, and if everything fails, well, we’ll all have a laugh and a beer and move on, right?


When you write software for Microsoft platforms you’ll pick up a thing or two after a while and you’ll begin to see a pattern: within Microsoft there are a lot of different teams, all trying to get the OK from upper management to keep doing what they’re doing. The numbers are so vast that teams are often not really working together but actually working against each other, even without knowing it, simply because they have their own agendas and goals, which are only known within those teams. All these teams produce stuff, new technology, to gain both users and the attention of upper management. Some of these technologies stick around and gain traction; others fail and die off. It can be that the decision of one team affects the future of another, but that’s part of the game: in the end it will all sort out. Perhaps both will stay, perhaps both will die, or upper management will step in and demand the teams talk.

We 3rd party developers look at what’s produced by all these teams and hope to bet on the technologies that stick around. Chances are (see above) that you’re betting on a crippled horse with one lung, and your investment is rendered void after a period of time. It’s therefore crucial to know up front as much as possible before taking the plunge and hope for the best.

With the investment to support .NET Core and ASP.NET vNext in our runtime this isn’t different: I want to know up front why I am doing this, why this is the best investment for my time, time I can’t spend on new features for my customers. I don’t think that’s an unreasonable question.

“Sell me this framework”

So I want Microsoft to sell it to me. Not with PR bullshit and hype nonsense, but with arguments that actually mean something. I want them to sell me their vision of the future, why I have to make the investment they ask from me. “Sell me this pen”, says Jordan Belfort in ‘The Wolf of Wall Street’, while holding up a basic ballpoint pen. It’s one of the many brilliant scenes in that wonderful movie which shows how hard it actually is to sell something which seems trivial but isn’t. Microsoft acts with their communication about .NET Core like the room full of sales people in the last scene of The Wolf of Wall Street, but they have to act like Brad, who picks up the pen and says “I want you to write your name on a napkin”, to which Jordan replies “But I don’t have a pen”. “Exactly, buy one?”.

It comes down to which future they mean with ‘.NET Core is the future’, and whose future that is. Will my customers who write desktop applications using Winforms or WPF be part of that future? Or will only ASP.NET users be part of that future? It’s very vague and unclear, and therefore uncertain. There’s contradicting information provided both through official channels and through unofficial channels (e.g. email) that paints the picture of the Microsoft we have all known for many years: a group of teams all trying to do their best, providing value for what their team stands for, while we outsiders have to make sense of the often contradicting visions ourselves.

My wife said last night: “They don’t want us there, all they want is stuff they control themselves”. I fear she’s right (as always); I have never felt more unwelcome in the world of .NET as today.

Our future

So I decided to make my own future and see where that gets me. This means I’ll spend the time I would otherwise spend on a .NET Core port on new features for our customers and will take a wait-and-see stance with .NET Core. After all, our customers had and have confidence that what we provide is solid enough for their future, and that’s what matters to me, not necessarily what’s best for Microsoft’s future.

comments 12/9/2014 11:53:00 AM

by: Jan Karel Pieterse
Finally got some time to include an Excel 2007-2013 version of this little tool that traces where you went in your Excel files and gets you back easily!
comments 11/19/2014 7:45:00 AM

by: Jan Karel Pieterse
RefTreeAnalyser 2.0 has just been updated. I have added an option to report all unique formulas in your workbook. Ever had to work out the logic of other people's Excel files? Ever had to untie the spaghetti-knots of a large Excel workbook's formulas? Then you know what a nightmare this can be! Now there is the RefTreeAnalyser! With this tool, finding out how a cell in a workbook derives its results and what other cells depend on the cell is a breeze.
comments 10/23/2014 6:45:00 PM