Include in VSIX property is missing


When using the Visual Studio 2010 SDK to create VSIX packages, you will set the Include in VSIX property for some of the files in your DslPackage project.

I’m not sure what happened; maybe it was because I imported the project from an older Visual Studio version. But in this project the property I needed so much was simply missing.

If the same happens to you, you can edit the DslPackage.csproj and add the following line:

<ProjectTypeGuids>{82b43b9b-a64c-4715-b499-d71e9ca2bd60};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
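
In case you are unsure where the line goes: ProjectTypeGuids is an MSBuild property, so it belongs inside a <PropertyGroup>, typically the first one in the project file. A minimal sketch (everything else in the file stays untouched):

<PropertyGroup>
  <!-- existing properties of the DslPackage project remain as they are -->
  <ProjectTypeGuids>{82b43b9b-a64c-4715-b499-d71e9ca2bd60};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
</PropertyGroup>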

It worked on my machine and I got this property back.

Creating a Setup with Visual Studio 2010 for an application that needs the full .net 4.0 framework


When using Visual Studio 2010 or any other version to create msi setups you can define prerequisites that need to be installed before the application will install. For most .NET applications this will be at least the .NET Framework itself. But from version 3.5 on there are two editions of this framework: the Client Profile and the Full Framework.

I created a setup and changed the prerequisite from the Client Profile to the Full Framework. When testing this on a virgin machine, the setup told me that I needed to install the .NET 4.0 Framework (that’s right), but then it directed me to the download of the Client Profile only.

To fix this, you have to change the InstallUrl of the .NET Framework Launch Condition:

  • right-click on the setup project in solution explorer and choose View –> Launch Conditions
  • here you will find a Launch Condition for the .NET Framework
  • select this Launch Condition and open the Properties Window
  • there you will find the InstallUrl property

For the full .NET 4.0 Framework, change this URL to http://go.microsoft.com/fwlink/?LinkId=186913

For the Client Profile of the .NET 4.0 Framework, the URL can stay as it is: http://go.microsoft.com/fwlink/?LinkId=131000

Deploy TT include files with a VSIX in Visual Studio 2010


What’s the problem?

First of all I have to explain to everybody what I’m talking about:

With Visual Studio 2010 Microsoft introduced a new way to deploy extensions to Visual Studio. Instead of creating a setup.exe that copies files, writes to the registry and calls devenv.exe with the /setup option, you now only need a single VSIX file that does everything necessary to install an extension.

When working with the T4 (Text Template Transformation Toolkit) system, one creates tt files containing some kind of script that is used to generate source code within a user’s Visual Studio project. The Microsoft DSL Tools do the same: every DSL project contains a number of tt files that generate all the code. If you take a look at such a file, you will only see two lines:

<#@ Dsl processor="DslDirectiveProcessor" requires="fileName='..\DslDefinition.dsl'" #>
<#@ include file="Dsl\Diagram.tt" #>

The magic happens with the include command. The real template code sits in the included file and not within every single project. When installing the DSL Tools extension to Visual Studio all these include files get installed somewhere where they can be found by the tt engine.

For my own DSL languages I would like to do the same: the user projects should only contain such two-line tt files and include the rest of the template from other files.

How to deploy tt template files?

The following steps will show you how to deploy additional files with your DSL extension.

I will start from a newly created Domain-Specific Language Designer project. Such a solution contains a Dsl and a DslPackage project, where the DslPackage project creates, besides the DslPackage.dll, a vsix file to install the designer on another machine.

Our goal will be to add some tt files to the vsix and make them includable on the installed machine.

  1. Add a folder to the DslPackage project and name it TextTemplates (or any other name)
  2. Add the tt files to this folder.
    • In the Properties window clear the “Custom Tool” property; otherwise Visual Studio will try to execute these templates within the DslPackage project.
    • set “Build Action” to “Content”
    • set “Include in VSIX” to “True”
  3. Add a file called Additional.pkgdef to the DslPackage project with the following content and
    • set “Build Action” to “Content”
    • set “Include in VSIX” to “True”
      [$RootKey$\TextTemplating\IncludeFolders]
      [$RootKey$\TextTemplating\IncludeFolders\.tt]
      "IncludeMyDsl"="$PackageFolder$\TextTemplates"

      Replace “IncludeMyDsl” by a unique name for your DSL but make sure it starts with the word “Include”.

  4. Edit the source.extension.tt in the DslPackage project and add the following line within the Content tag:

    <VsPackage>Additional.pkgdef</VsPackage>

With these changes the tt files from the TextTemplates folder get packaged into the vsix file and will be accessible via the include directive on the machine where this extension is installed.
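
A user project would then again contain only a tiny two-line stub, analogous to the DSL Tools example above. A sketch of such a stub; the directive processor, model and file names are made up for illustration, and the included file name is resolved against the IncludeFolders registered by the pkgdef:

<#@ MyDsl processor="MyDslDirectiveProcessor" requires="fileName='..\Model.mydsl'" #>
<#@ include file="CodeGeneration.tt" #>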

Be careful when using GetCallingAssembly() and always use the release build for testing


This looks like such an innocent method, but it led to big trouble in one of my projects. Let’s start with when someone would use this method, which is declared as a static method on the Assembly class. In the MSDN you can read:

Returns the Assembly of the method that invoked the currently executing method.

In our project we had an assembly with a lot of helper methods. One of these gets resources from the calling assembly. In various places of our code we called this method to get icons or other resources. The method used exactly this GetCallingAssembly() call to figure out which assembly to search for resources.

That worked pretty well in debug mode, but exceptions were thrown in release mode. We could not understand what was going on. It got even worse: when we built a release version and tried to debug that version (using the Visual Studio debugger), it worked again. It looked like a heisenbug.

It took us some time to figure out what is also written in MSDN:

If the method that calls the GetCallingAssembly method is expanded inline by the compiler (that is, if the compiler inserts the function body into the emitted Microsoft intermediate language (MSIL), rather than emitting a function call), then the assembly returned by the GetCallingAssembly method is the assembly containing the inline code. This might be different from the assembly that contains the original method. To ensure that a method that calls the GetCallingAssembly method is not inlined by the compiler, you can apply the MethodImplAttribute attribute with MethodImplOptions.NoInlining.

The JIT compiler moves code around to optimize for performance. Small methods (up to about 56 bytes of IL code, if I remember right) can be inlined at the place where the method call was before. But the compiler does this only in release mode, not in debug mode. And when we attached the debugger to our release build, the JIT compiler stopped inlining to enable debugging, and our bug was gone.

After understanding this, the fix is easy. Just don’t allow the compiler to inline that particular method that calls Assembly.GetCallingAssembly(). Then the method stays in the assembly where the source code is written and everything will be fine.

[MethodImplAttribute(MethodImplOptions.NoInlining)]
public void SomeFunction(int i)
{
    // ...
    var a = Assembly.GetCallingAssembly();
    // ...
}

This attribute does the trick, and I recommend using it on all methods that call GetCallingAssembly(), can be called from another assembly and need the real calling assembly.
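
For illustration, a minimal sketch of such a resource helper; the class and method names are made up (not our actual code), and it assumes the resources are embedded in the calling assembly:

using System.IO;
using System.Reflection;
using System.Runtime.CompilerServices;

public static class ResourceHelper
{
    // NoInlining keeps this method in the helper assembly; if it were inlined
    // into a caller, GetCallingAssembly() would no longer return that caller.
    [MethodImplAttribute(MethodImplOptions.NoInlining)]
    public static Stream GetCallerResource(string resourceName)
    {
        Assembly caller = Assembly.GetCallingAssembly();
        return caller.GetManifestResourceStream(resourceName);
    }
}

A caller in another assembly simply calls ResourceHelper.GetCallerResource("Some.Namespace.SomeIcon.png") and gets its own embedded resource back.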

A Change event for Dependency Properties


WPF comes with Dependency Properties, and everybody using WPF will know about this new kind of property. When you define your own Dependency Property in your own class, you can pretty easily add a property change event handler:

public int MyProperty
{
    get { return (int)GetValue(MyPropertyProperty); }
    set { SetValue(MyPropertyProperty, value); }
}
public static readonly DependencyProperty MyPropertyProperty =
    DependencyProperty.Register("MyProperty",
                        typeof(int),
                        typeof(MainWindow),
                        new UIPropertyMetadata(0,
                            MyPropertyvalueChangeCallback));

private static void MyPropertyvalueChangeCallback
                        (DependencyObject d,
                         DependencyPropertyChangedEventArgs e)
{
}

But how do you add such an event handler to an already existing Dependency Property, or somewhere else than in the defining class? E.g. to a property of a WPF control that was not built by you. Take a standard WPF TextBox: both the Text and the FontSize properties are Dependency Properties, but the TextBox class only provides a change event for the Text property. Nevertheless, you can get a change event for any Dependency Property:

DependencyPropertyDescriptor dpd =
    DependencyPropertyDescriptor.FromProperty
        (Control.FontSizeProperty, typeof (TextBox));
dpd.AddValueChanged(someTextBox, SomeTextBoxFontSizeChanged);

Every time the FontSizeProperty on the instance someTextBox changes, the given method is called. It’s that easy, and you can implement this code anywhere, not only within the class that defines the property.
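
The handler registered with AddValueChanged is a plain EventHandler, so there is no DependencyPropertyChangedEventArgs here. A small sketch of what SomeTextBoxFontSizeChanged could look like:

private void SomeTextBoxFontSizeChanged(object sender, EventArgs e)
{
    // sender is the component passed to AddValueChanged, here the TextBox
    var textBox = (TextBox)sender;
    System.Diagnostics.Debug.WriteLine(textBox.FontSize);
}

Keep in mind that AddValueChanged stores a strong reference to the handler; call dpd.RemoveValueChanged(someTextBox, SomeTextBoxFontSizeChanged) when you no longer need the notification, so the TextBox is not kept alive longer than necessary.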

Foxit Reader vs. Acrobat Reader: the default program problem


I’m a big fan of the Foxit PDF Reader. It is just so damn fast compared to Acrobat Reader when opening PDF documents. Yesterday I had to go through a bunch of PDFs, each containing only one or two pages, to find a specific one. While waiting for Acrobat Reader to open a file I could open, check and close a few files with Foxit. If you are not using Foxit Reader, give it a try…

But for some PDF features you will need the original Adobe Reader. E.g. documents with fields to enter data, with scripts, or DRM-secured files won’t open in Foxit.

I always have both installed on my machine, and Foxit is the default application for opening PDF documents. But this setting changed some time ago on my Windows 7 64-bit machine.

I reset the default program for PDF in the Windows Control Panel to Foxit, and all icons of PDF documents changed to the Foxit one. I can double-click a file and Foxit opens – great. But the very moment Foxit comes up, the default program is set back to Acrobat Reader. This happens every time.

I’m not sure what’s going on here and what is involved. I was blaming Acrobat and searched for a solution on the Acrobat side of the problem. Nothing.

In the end, an update of Foxit to the current version fixed my problems.

Don’t respect .inf-files too much


Getting a new laptop always triggers a little cascade in my home network. I have three machines: an older laptop serves as a server, a quad-core desktop PC provides gaming power, and a newer laptop accompanies me wherever I go.

Some days ago I replaced the old server laptop (running WinXP) with a newer one, now running Win7. My printer (Brother DCP-120C) is a little aged now and it’s not a network printer. So I attached it to the server (via USB) and thus it’s accessible in the entire network.

Attaching the printer to Win7 worked flawlessly. Importing the printer (which involves copying the drivers) to another Win7-machine was not a problem either.

But my quadcore still runs WinXP and wouldn’t accept the drivers from the Win7 server. Instead a dialog asked me to provide the appropriate driver manually.

Okay, so I downloaded the latest DCP-120C-drivers for WinXP, unzipped them and guided XP to the .inf-files. XP said: “No, I can’t see the right drivers.”

Unfortunately there is no way to force WinXP to use certain printer drivers. So I spent ten minutes playing the old yes-no game before I took a look at the .inf file. I have never dealt with .inf files a lot, but I noticed this line:


"Brother DCP-120C Printer"   = BRDP120C.PPD, BrotherDCP-120C65FA

Hmm… okay… it is there. My Brother DCP-120C Printer. Fine. Why doesn’t WinXP recognize it? I found another .inf file in the driver package with this line:


"Brother DCP-120C USB Printer"   = BRDP120C.PPD…

Hmm… now the name is DCP-120C USB Printer. Does the funny name string make any difference? I took a closer look at the printer device that was recognized by my server:

[Screenshot: the shared printer device as listed on the server; its name does not contain “Printer”]

Okay, nice, there is no “Printer” in the name… can it be… well… I copied the line in the .inf file and replaced “Printer” with “USB”:

"Brother DCP-120C Printer"   = BRDP120C.PPD, BrotherDCP-120C65FA
"Brother DCP-120C USB"   = BRDP120C.PPD, BrotherDCP-120C65FA

Bingo! That did the trick.

When accessing printers installed on other network computers, the printer drivers really seem to be matched by their name. That hurts a bit, but knowing it may help solve driver problems from time to time.

VMWare Workstation demands Administrator rights – even if you have those


Whenever I move my system to a new laptop, I create a virtual machine from my old system. Sometimes it is easier to access stuff I forgot to copy from a running system than from mounted backup devices.

To create virtual machines from a running system I use VMWare Workstation, which outperforms VirtualPC by far (imho).

Some days ago I tried to create a VM from a Win7/64 system. I was quite surprised when VMWare Workstation told me I needed Administrator rights:

[Screenshot: VMWare Workstation error message demanding Administrator rights]

So what – I have administrative rights. But Workstation didn’t believe me. Since that kept me from moving to my new laptop, I was really p*ssed.

Luckily some smart guy found this solution.

I think this is clearly a bug, but for now VMWare Workstation demands Administrator rights in the literal sense: it is only satisfied when you are logged in as THE Administrator:

  • Start c:\Windows\system32\cmd.exe from Explorer using “Run as administrator”.
  • Type net user Administrator /active:yes
  • Log off & reboot.
  • Log in as Administrator.
  • Create your image.
  • Complain to VMWare.

Acronis True Image Home 2010 hangs with Windows 7


I am a great fan of Acronis True Image. I have been using it for a long time and it has saved my skin more than once. So I was glad to hear that the new version (True Image 2010) was designed to work with Win 7. And so it did.

Last week I got a new laptop, and today I tried to back up my fresh installation of Win7/64. Each time, True Image caused my entire system to freeze after five seconds of backing up. No reaction at all; the only way out was a hard reset via the power button. I tried different options without results. (Which made me panic, because what else can you rely on if not your backup tool?)

Fortunately I found the solution here.

There is a new driver for hdd-image snapshots. I installed it, and True Image 2010 works again.

Phew!

Shell-Lib reloaded: Another OS, another bug.


Some time ago we blogged about a nifty little .NET assembly that enables you to access the basic Windows shell operations in shell32.dll. You may want to do so especially on Win7, because your file operations then integrate with the fancy Win7 dialogs, the flashing progress bar etc.

The problem we blogged about in the former post was a missing “Pack” instruction in the marshalling description of a structure. I guess we didn’t test it with WinXP64… but some days ago we tested it with Win7/64 and it failed. We figured out that we had introduced a new bug that occurred on 64-bit machines only. After some more testing we found out that:

  • The original code (without Pack-value) crashes on 32 bit machines (as described in the former post).
  • The original code (without Pack-value) works fine on 64 bit machines (but we don’t think the original developer had 64 bits 7 years ago 😉 ).
  • Our fixed code (with Pack=2) works fine on 32 bit machines.
  • Our fixed code (with Pack=2) crashes on 64 bit machines.

It seems that the “SHFILEOPSTRUCT” structure in shell32.dll has different layouts on Win7/64 and Win7/32. I guess the guys at MS aim for performance and tell the compiler to optimize. The compiler optimizes by assuming the optimal Pack-value, which usually is the word size of the processor in bytes: 4 for 32-bit, 8 for 64-bit.

What effect does the Pack-value have at all?

The "Pack"-value controls the alignment of the starting addresses of each single member of a structure (often called "offset").

Let’s say you have this struct:

struct MammaMia
{
    public short sweet16;
    public byte bite;
    public string tanga;
}

Usually you don’t care about the exact way your data is stored in memory, but when you pass it from .NET to the Win-API you have to make sure the Win-API receives and returns exactly the right format. This process is called “marshalling”.

One crucial instruction for doing so is the StructLayout attribute.

[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct MammaMia
{
    public short sweet16;
    public byte bite;
    public string tanga;
}

(Note: There are a lot more options for the StructLayout attribute and a lot more attributes that help you with marshalling.)

LayoutKind.Sequential means that your data is represented in memory in the same order as you declared it: first “sweet16”, then “bite” and then “tanga”.

Pack=1 tells .NET that every byte can be the starting point of the next member. So .NET produces this structure in memory:

Pack=1:

  byte   data
  0      sweet16
  1      sweet16
  2      bite
  3      tanga
  4      tanga
  …      …
  20     last character of tanga

Now Pack=2 makes sure that only every 2nd byte can be the starting point of the next member. That leads – of course – to unused bytes in memory:

Pack=2:

  byte   data
  0      sweet16
  1      sweet16
  2      bite
  3      ??? (unused)
  4      tanga
  5      tanga
  …      …
  21     last character of tanga

The higher the Pack-value, the more unused bytes you get:

Pack=4:

  byte   data
  0      sweet16
  1      sweet16
  2      ??? (unused)
  3      ??? (unused)
  4      bite
  5      ??? (unused)
  6      ??? (unused)
  7      ??? (unused)
  8      tanga
  9      tanga
  …      …
  25     last character of tanga

Is this a waste of memory? Sure. But processors can access those aligned addresses a bit faster than other addresses. So it’s a performance/memory tradeoff.
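
If you want to check where the members actually end up, Marshal.OffsetOf will tell you. A small sketch (my own example, not ShellLib code) for the Pack=1 and Pack=2 variants of the struct above, which should print the offsets from the first two tables:

using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct MammaMiaPack1
{
    public short sweet16;
    public byte bite;
    public string tanga;
}

[StructLayout(LayoutKind.Sequential, Pack = 2)]
struct MammaMiaPack2
{
    public short sweet16;
    public byte bite;
    public string tanga;
}

class PackDemo
{
    static void Main()
    {
        // Pack = 1: tanga follows bite immediately
        Console.WriteLine(Marshal.OffsetOf(typeof(MammaMiaPack1), "bite"));  // 2
        Console.WriteLine(Marshal.OffsetOf(typeof(MammaMiaPack1), "tanga")); // 3
        // Pack = 2: one padding byte is inserted before tanga
        Console.WriteLine(Marshal.OffsetOf(typeof(MammaMiaPack2), "bite"));  // 2
        Console.WriteLine(Marshal.OffsetOf(typeof(MammaMiaPack2), "tanga")); // 4
    }
}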

After some more testing, I figured out that the correct Pack-value for Win7/64 is 8, while the correct Pack-value for Win7/32 is 2 (see the former post). Why 2? I’d expect 4 here. I don’t know, and Microsoft won’t provide the source code, I guess. Maybe in former Windows versions they preferred a middle course between performance and saving memory.

The simple solution

Now it was obvious how to make the ShellLib assembly work with 32 and 64 bit: check what kind of machine we are running on and use a different marshalling. Since I had to declare different structs here, the major part of the code is copying data back and forth, which is not very interesting.

The only interesting part here is: how do you check what kind of machine you are running on? I was looking for some property on System.Environment but didn’t find anything. Then Andreas pointed me to this post, and the solution is too simple to cross one’s mind:

“Just do a IntPtr.Size (static method) and since IntPtr is designed to be an integer whose size is platform specific, it will be 8 if you are on a 64-bit machine and 4 if you are on a 32-bit machine.“
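
In C# this check is a one-liner; a minimal sketch (the way we then pick between the two struct declarations is only hinted at in the comments):

using System;

static class PlatformInfo
{
    // IntPtr is 4 bytes in a 32-bit process and 8 bytes in a 64-bit process.
    public static bool Is64BitProcess
    {
        get { return IntPtr.Size == 8; }
    }
}

// Usage: depending on PlatformInfo.Is64BitProcess, choose the SHFILEOPSTRUCT
// declaration with Pack = 8 (64 bit) or the one with Pack = 2 (32 bit).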

You can find more information and our fixed code soon at codeproject.com.

Epilog

You can also do the marshalling all by yourself, starting with memory allocation and ending with freeing memory, just like in the old times. But this is another story and will be told later.

Maybe.
