Channel: SQL Server Data Tools Team Blog

Known issue: SSDT install fails validating certificates


We are aware of a new issue where installing SSDT fails with the following message: "A required certificate is not within its validity period when verifying against the current system clock or the timestamp in the signed file."  This is caused by an issue in the installer's verification code involving a SHA256 certificate that expired on 10/6/2013, and we are actively working to update the installer.  There are two workarounds:

1)  You can install the SQL Server 2012 SP1 components separately and then rerun the SSDTSetup.exe installer.  The components required are SharedManagementObjects.msi (x86 and x64), SQLDOM.msi (x86 and x64), TSQLLanguageService.msi (x64), SqlLocalDB.msi (x64), SQLSysClrTypes.msi (x86 and x64), and SSDTDBSvcExternals.msi (located here).

2)  You can set your clock back to a date before 10/6 and then run the SSDTSetup.exe installer.

Users should not be affected if they are updating from a previously installed SSDT, as the SQL Server 2012 SP1 MSIs should already be present on the machine.


Updated SQL Server Data Tools October 2013


The October 2013 release contains an updated installer that no longer fails due to certificate validation of the chained MSIs.  This release also contains a code change that allows graceful handling of SQL Server 2014 CTP2 connections where they are not supported.

Get it here: http://msdn.microsoft.com/en-us/data/hh297027

 

Contact Us

If you have any questions or feedback, please visit our forum or Microsoft Connect page.
We look forward to hearing from you.

SQL Server Data Tools for SQL Server 2014 CTP2 is available


We’d like to announce the availability of SSDT with support for SQL Server 2014 databases.  SQL Server Data Tools for SQL Server 2014 CTP2 is pre-release software; it does not support upgrading from an existing SSDT install or upgrading to any future SSDT release, and it should be installed only on a clean machine.  This toolset should not be used against production SQL Server databases.

We have started implementing support for new SQL Server 2014 objects, but we have a few known issues:

  • Modifying a memory-optimized table will drop and recreate the table during deployment without moving data, which could result in data loss.  We suggest moving data to a temporary table before deploying changes through SSDT.
  • There are instances where running Extended TSQL verification will cause Visual Studio to close when publishing a database project.  We recommend turning off the Extended TSQL verification option in the database project properties.
  • Default constraints on a memory-optimized table are not recognized by SSDT when working in SQL Server Object Explorer or when initially publishing a project.  If your project contains them and you publish again, SSDT will try to alter the table to add the default constraint, which is not supported on memory-optimized tables.
  • The table designer design pane will not support all new properties.  Users can work around this by typing the syntax into the script pane.
  • The data compression option on clustered column store indexes is not supported.

We appreciate all feedback and encourage use of our forum or Microsoft Connect site. We would especially appreciate any feedback on the use of our tools against SQL14 databases and the new features.  If there is anything that is not working as expected or producing incorrect results, please let us know so we can address high priority issues before RTM. 

Run or download SSDTSetup.exe from http://www.microsoft.com/en-us/download/details.aspx?id=40850.  We will not be providing ISO images for this release.  Alternatively, you can create an administrative install using the instructions on the download page.

 

SQL Server 2014 CTP2 only supports the following languages:  English, German, Japanese, Spanish, and Chinese (Traditional).

 

DacFx Public Model Tutorial


Recently there has been increasing interest in extending the capabilities of SSDT and DacFx. The walkthrough guides for creating new build and deployment contributors and database unit test conditions are a useful start in exploring the tools, but they only scratch the surface of what’s possible. They also don’t really show the best practices for developers when building and debugging extensions such as deployment contributors. In this article we’ll fix that by covering the key concepts behind DacFx extensions, solving real customer issues, and highlighting best practices. All of the code in this tutorial is available at https://github.com/Microsoft/DACExtensions.

What is the public model?

The key to most extensibility is the public model API. Dacpacs and SSDT projects both model a database’s schema. The public model API lets you access that model programmatically. You can load, query and manipulate the schema to do whatever you’d like. Most scenarios will rely on some level of querying the model and examining the objects that describe the database.

The public model API is loosely typed: the TSqlModel contains loosely typed TSqlObjects that represent all the elements in your schema. Each object will have some Properties that describe its state, and Relationships to other objects in the model. Whether the object you’re looking at is a Table, View, Index or anything else, they’re all represented by the one TSqlObject class.

Of course, if everything is a TSqlObject, how can you tell Tables and Views apart? How can you even know what properties and relationships a Table has? That’s where the strongly-typed metadata classes come in. The majority of classes in the model API are actually metadata classes – you’ll see Table, View, etc. Each class has a number of fields that list the Properties and Relationships for that type of object. To look up Tables in the model you pass Table.TypeClass to GetObjects, and only tables are returned. To get the Columns for a table, you ask for relationships and pass in the Columns relationship class. If all this seems complicated, the code examples should make it clearer. The important thing to note is that you’ll pass in these metadata descriptions whenever you query the model.

Scenario: using the public model API

OK, let’s get started with using the public model! We’ll show the basics of loading, reading, adding to and saving the model. All the code in this example is in the SampleConsoleApp\ModelEndToEnd.cs sample, and can be run by specifying “RunEndToEnd” when running the SampleConsoleApp application.

Loading a model

Loading a model is really simple – either point to the location of an existing Dacpac, or create an empty model and add scripts to it.

```csharp
// Load from a dacpac
using (TSqlModel modelFromDacpac = new TSqlModel("mydb.dacpac"))
{
    // Note: models are disposable; always have a "using" statement or
    // some other way of disposing them
}

// Creating a new SQL Server 2012 model and adding some scripts to it
using (TSqlModel model =
    new TSqlModel(SqlServerVersion.Sql110, new TSqlModelOptions { }))
{
    string[] scripts = new[]
    {
        "CREATE TABLE t1 (c1 NVARCHAR(30) NOT NULL)",
        "CREATE TABLE t2 (c2 INT NOT NULL)"
    };
    foreach (string script in scripts)
    {
        model.AddObjects(script);
    }
}
```

Notes:

  • All the database options that you can specify in an SSDT project can be defined using TSqlModelOptions when creating a new model.
  • The samples show how to copy options from an existing model to a new model if you need to do this.
  • When adding objects to the model, certain properties can be defined, such as how ANSI nulls and quoted identifiers are treated.

Reading a Table, its properties and relationships

Reading top level types such as Tables and Views is easy, as shown below. Top level types are any type that could be defined independently in TSQL – Tables, Views, but also things like a Primary Key Constraint since this can be specified in an ALTER TABLE statement.

```csharp
private static void ReadTheModel(TSqlModel model)
{
    // This will get all tables. Note the use of Table.TypeClass!
    var tables = model.GetObjects(DacQueryScopes.Default, Table.TypeClass).ToList();

    // Look up a specific table by ID. Note that if no schema is defined when
    // creating an element, the default "dbo" schema is used
    var t1 = model.GetObjects(Table.TypeClass,
        new ObjectIdentifier("dbo", "t1"), DacQueryScopes.Default).FirstOrDefault();

    // Get the column referenced by this table, and query its length
    TSqlObject column = t1.GetReferenced(Table.Columns)
            .First(col => col.Name.Parts[2].Equals("c1"));

    int columnLength = column.GetProperty<int>(Column.Length);
    Console.WriteLine("Column c1 has length {0}", columnLength);

    // Verify the ColumnType of this column. This can help indicate which
    // properties will return meaningful values. For instance, Column.Collation
    // is only available on a simple column, and Column.Persisted only on
    // computed columns
    ColumnType columnType = column.GetMetadata<ColumnType>(Column.ColumnType);
    Console.WriteLine("Column c1 is of type '{0}'", columnType);
}
```

Relationships

To examine something like a specific Column for a table, you need to first look up the relevant table and then get referenced columns. Note how the Table.Columns metadata relationship is used to find columns for the table.

Properties

The example above shows how the Column.Length metadata property is used to get the length of a column.

If you know the return type for a given property you can use generics to cast to that type. In the example above, "Length" is cast to an int. Properties usually have simple return types, such as int, bool, and string. Some int properties actually map to Enum values – for example DataCompressionOption.CompressionLevel maps to the CompressionLevel enumeration – and you can cast directly to that Enum type when getting the property. Note that if the property is not found on that object, a default value for that type may be returned instead.
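As a minimal sketch of that enum cast (assuming `compressionInfo` is a TSqlObject of type DataCompressionOption obtained from an index's relationships):

```csharp
// Sketch only: "compressionInfo" is assumed to be a TSqlObject whose type is
// DataCompressionOption. The underlying property value is stored as an int,
// but because it maps to the CompressionLevel enumeration we can cast
// directly on retrieval instead of calling GetProperty<int> and converting.
CompressionLevel level =
    compressionInfo.GetProperty<CompressionLevel>(
        DataCompressionOption.CompressionLevel);
Console.WriteLine("Compression level: {0}", level);
```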

Metadata

Finally, a very small number of types in the model have actual “Metadata” properties. These are useful when a type can actually represent conceptually similar things, where each has different properties. A Column can be a regular column, a computed column or a ColumnSet, and what properties are relevant for the column will vary depending on the ColumnType.

Notes:

DacQueryScopes can be quite important. It specifies what kind of objects you want to search for. Depending on the scope you pass in, different types of objects can be returned:

| What are you looking for? | Correct query scope |
| --- | --- |
| The objects you defined in this dacpac | UserDefined, All |
| Built-in types (for example SQL data types like nvarchar) | BuiltIn, Default, All |
| Referenced objects added using composite projects in SSDT ("Same Database" references) | SameDatabase, All |
| System objects from master.dacpac | System, All |

You may notice that “different database” references aren’t on this list. That’s because they’re not really useful for anything other than validating the model, and you can never have a TSqlObject that describes them. The only time you’ll get to see any information about them is when querying what types of things an object references, and there’s a special call with an external query scope that’ll include some information about them.


GetReferenced is only one of several methods to traverse relationships in the model, depending on the type of relationship. For example, a Table relates to an index and to a column differently:
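A hedged sketch of the idea (the `Index.IndexedObject` relationship name is an assumption; check the metadata classes in your DacFx version):

```csharp
// Sketch: "table" is assumed to be a TSqlObject for a Table.
// A Table references its columns, so we traverse forwards with GetReferenced...
foreach (TSqlObject column in table.GetReferenced(Table.Columns))
{
    Console.WriteLine("Referenced column: {0}", column.Name);
}

// ...but an Index references the table it is defined on, so from the table's
// side that relationship is traversed in reverse with GetReferencing.
foreach (TSqlObject index in table.GetReferencing(Index.IndexedObject))
{
    Console.WriteLine("Index on this table: {0}", index.Name);
}
```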

To simplify this a little in the public model, we added GetChildren and GetParent methods. In this case you shouldn’t need to understand which object has a reference to the other, or what the relationship is. It will just return all the objects that are logical children of a Table:
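A sketch of that (again assuming `table` is a TSqlObject for a Table):

```csharp
// Sketch: GetChildren returns every logical child of the table - columns,
// constraints, indexes and so on - without naming specific relationships.
foreach (TSqlObject child in table.GetChildren())
{
    Console.WriteLine("{0} ({1})", child.Name, child.ObjectType.Name);
}

// GetParent walks the other way, e.g. from a column back to its table:
// TSqlObject owningTable = column.GetParent();
```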


Some relationships have properties associated with them. For instance the relationship between a table constraint and the columns that it refers to has an Ascending property. These properties are queryable using ModelRelationshipInstance.GetProperty<T>.
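A sketch of querying such a relationship property follows; the exact metadata class names here (in particular `PrimaryKeyConstraint.ColumnsRelationship.Ascending`) are assumptions used to illustrate the pattern:

```csharp
// Sketch: "pk" is assumed to be a TSqlObject for a PrimaryKeyConstraint.
// GetReferencedRelationshipInstances exposes the relationship instances
// themselves, so properties defined on the relationship (rather than on
// either endpoint object) can be queried per instance.
foreach (ModelRelationshipInstance columnRef in
    pk.GetReferencedRelationshipInstances(PrimaryKeyConstraint.Columns))
{
    bool ascending = columnRef.GetProperty<bool>(
        PrimaryKeyConstraint.ColumnsRelationship.Ascending);
    Console.WriteLine("{0} ascending: {1}", columnRef.ObjectName, ascending);
}
```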

 

Saving a dacpac

The public model supports building dacpacs and even updating the model inside an existing dacpac. Unfortunately the API does not fully support everything that an SSDT project supports. This may change in the future, but for now the feature support is as follows:

| Feature | Supported? |
| --- | --- |
| Refactor log | Yes |
| Deployment contributors | Yes |
| Pre / Post deployment script | No |
| References | No |
| CLR objects | No |
| XML Schema Collection | No |

```csharp
// Save the model to a new .dacpac. Note that the PackageOptions
// can be used to specify the RefactorLog and contributors to include
DacPackageExtensions.BuildPackage(
    dacpacPath,
    model,
    new PackageMetadata { Name = "MyPackageName",
        Description = "This is usually ignored", Version = "1.0" },
    new PackageOptions());

// You can update the model in a dacpac and save it back.
using (TSqlModel modelFromDacpac = new TSqlModel(dacpacPath))
{
    modelFromDacpac.AddObjects("CREATE VIEW V1 AS SELECT * FROM T1");

    using (DacPackage dacPackage = DacPackage.Load(dacpacPath,
        DacSchemaModelStorageType.Memory,
        FileAccess.ReadWrite))
    {
        DacPackageExtensions.UpdateModel(dacPackage, modelFromDacpac, null);
    }
}
```

 

Scenario: filtering developer schemas

A real example raised in the forums was how to filter out objects for specific schemas. For example a user may have a “dev” or “test” schema that is populated with some data used during testing. However these should never be deployed to the production environment. The question is, how can you achieve this without using separate projects for the “dev” and “test” schema elements? Two general solutions come to mind here, each with different benefits and drawbacks. We’ll outline both approaches and show the key code required to solve this problem. For full code examples we recommend going to the samples solution and debugging into the sample application and unit tests. That’s really the best way to learn what’s going on here.

Our sample data

Here’s the sample data we’ll use for this scenario. It’s very simple – just a few schemas, tables and views we want to work with. Our goal is to start with a dacpac that includes all of these schema objects, and ensure that what’s deployed to a database only includes objects in the “prod” schema.

```csharp
string[] SampleScripts = new string[]
{
    // Prod
    "CREATE SCHEMA [prod]",
    "CREATE TABLE [prod].[t1] (c1 INT NOT NULL PRIMARY KEY)",
    "CREATE VIEW [prod].[v1] AS SELECT c1 FROM [prod].[t1]",

    // Dev
    "CREATE SCHEMA [dev]",
    "CREATE TABLE [dev].[t2] (c2 INT NOT NULL PRIMARY KEY)",
    "CREATE VIEW [dev].[v2] AS SELECT c2 FROM [dev].[t2]",

    // Test - include reference to production table to highlight errors
    // if filtering breaks references
    "CREATE SCHEMA [test]",
    "CREATE VIEW [test].[v3] AS SELECT c1 FROM [prod].[t1]",
};

// Create a package containing the sample scripts
string devPackagePath = GetFilePathInCurrentDirectory("dev.dacpac");
var scripts = SampleScripts;
using (TSqlModel model =
    new TSqlModel(SqlServerVersion.Sql110, new TSqlModelOptions()))
{
    AddScriptsToModel(model, scripts);

    DacPackageExtensions.BuildPackage(devPackagePath, model, new PackageMetadata());

    Console.WriteLine("Objects found in original package: '" + devPackagePath + "'");
    PrintTablesViewsAndSchemas(model);
}
```

 

Solution 1: Filtering the model and creating a new dacpac

The first solution assumes that whenever you build your project, you’d like to output two dacpacs: a “production” dacpac that doesn’t contain the “dev” or “test” schemas, and a “dev” dacpac that contains all objects. “production” would be used when deploying to a production database, while the “dev” dacpac is used during development.

Let’s look at the key steps required to do this. All the code in this example is in the SampleConsoleApp\ModelFilterExample.cs sample, and can be run by specifying “FilterModel” when running the SampleConsoleApp application. There are also unit tests for this in the “SampleTests\TestFiltering.cs” file.

Filtering the model and building a new dacpac

Let’s create a simple “IFilter” interface that takes in a set of TSqlObjects and performs some action. We’ll write a schema filter and apply it to all the objects in our model, then save it to a dacpac. The basic process is as follows:

```csharp
public void CreateFilteredDacpac(string dacpacPath, string filteredDacpacPath)
{
    DisposableList disposables = new DisposableList();
    try
    {
        // Load a model from the dacpac.
        TSqlModel model = disposables.Add(
            new TSqlModel(dacpacPath, DacSchemaModelStorageType.Memory));

        // Filter the objects and copy them to a new model.
        TSqlModel filteredModel = disposables.Add(CreateFilteredModel(model));

        // Create a new dacpac using the new model.
        DacPackageExtensions.BuildPackage(
            filteredDacpacPath,
            filteredModel,
            new PackageMetadata(),
            new PackageOptions());
    }
    finally
    {
        disposables.Dispose();
    }
}
```

The filter works by examining the first part of the TSqlObject.Name property, an ObjectIdentifier that describes the name. The internal parts of the name always start with the schema. Even if the name describes a reference to an external object (for example to the master DB or a different database), the external parts of the name are kept in a separate property. This makes it easy to write a schema-based filter. Here’s a simplified version (again, look at the sample files for a fully fleshed-out example):

```csharp
public interface IFilter
{
    IEnumerable<TSqlObject> Filter(IEnumerable<TSqlObject> tSqlObjects);
}

public class SchemaBasedFilter : IFilter
{
    private HashSet<string> _schemaNames;

    public SchemaBasedFilter(IList<string> schemaNames)
    {
        _schemaNames = new HashSet<string>(schemaNames);
    }

    public IEnumerable<TSqlObject> Filter(IEnumerable<TSqlObject> tSqlObjects)
    {
        // Only return objects that pass the "ShouldInclude" test.
        return tSqlObjects.Where(o => ShouldInclude(o));
    }

    private bool ShouldInclude(TSqlObject tsqlObject)
    {
        bool found = false;
        ObjectIdentifier id = tsqlObject.Name;
        if (id.HasName && id.Parts.Count >= 1)
        {
            string schemaName = id.Parts[0];
            found = _schemaNames.Contains(schemaName,
                        StringComparer.OrdinalIgnoreCase);
        }

        // If the object had one of the filtered schema names, we exclude it
        return !found;
    }
}
```

Finally, there’s the CreateFilteredModel method that reads all objects from the current model and copies only objects that pass the filter into a new model:

```csharp
// Full ModelFilterer code including class initialization can be viewed
// in the samples project.
public class ModelFilterer
{
    private IList<IFilter> _filters;

    public TSqlModel CreateFilteredModel(TSqlModel model)
    {
        TSqlModelOptions options = model.CloneModelOptions();
        TSqlModel filteredModel = new TSqlModel(model.Version, options);

        IEnumerable<TSqlObject> allObjects = model.GetObjects(QueryScopes);
        IFilter allFilters = new CompositeFilter(_filters);
        foreach (TSqlObject tsqlObject in allFilters.Filter(allObjects))
        {
            string script;
            if (tsqlObject.TryGetScript(out script))
            {
                // Some objects such as the DatabaseOptions can't be scripted out.
                filteredModel.AddObjects(script);
            }
        }

        return filteredModel;
    }
}
```

Notes:

The schema name comparison currently uses a simple string comparison. Ideally it would compare based on the SQL Database Collation for the model by using SqlString objects for comparison. This is the kind of feature we may add in future releases, but you could also write this yourself fairly easily.
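One way to sketch a collation-style comparison is with System.Data.SqlTypes.SqlString; the LCID and compare options below are illustrative assumptions rather than values read from the model's actual collation:

```csharp
using System.Data.SqlTypes;

// Sketch: compare schema names the way a case-insensitive SQL collation would.
// Ideally the LCID and SqlCompareOptions would be derived from the model's
// collation; 1033 (en-US) and IgnoreCase are assumptions here.
SqlString left = new SqlString("dev", 1033, SqlCompareOptions.IgnoreCase);
SqlString right = new SqlString("DEV", 1033, SqlCompareOptions.IgnoreCase);
bool sameSchema = (left == right).IsTrue;
```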

Updating the model in the existing dacpac

The API also supports updating the model inside an existing dacpac. This can be useful if you have other resources, such as pre and post deployment scripts, inside a dacpac. The public API doesn’t yet support including these when building a dacpac, so the best solution is to copy the dacpac file and then update the model inside it. There’s a unit test in TestFiltering.cs that shows how this is done. The API call is really simple:

```csharp
public void UpdateDacpacModelWithFilter(string dacpacPath)
{
    DisposableList disposables = new DisposableList();

    try
    {
        TSqlModel model = disposables.Add(
            new TSqlModel(dacpacPath, DacSchemaModelStorageType.Memory));
        TSqlModel filteredModel = disposables.Add(CreateFilteredModel(model));

        // Note that the package must be opened in ReadWrite mode -
        // this will fail if it isn't specified
        DacPackage package = disposables.Add(
            DacPackage.Load(dacpacPath,
                DacSchemaModelStorageType.Memory, FileAccess.ReadWrite));
        package.UpdateModel(filteredModel, new PackageMetadata());
    }
    finally
    {
        disposables.Dispose();
    }
}
```

Deploying the filtered Dacpac

Deploying a Dacpac is really simple using the DacServices API. DacServices supports publishing Dacpacs, creating Dacpacs from a database, and a number of other useful features. To actually deploy our filtered Dacpac to production (or in this example, to localdb) we’d just do as follows:

```csharp
private void PublishProductionDacpac(string productionPackagePath)
{
    string extractedPackagePath = GetFilePathInCurrentDirectory("extracted.dacpac");
    using (DacPackage package =
        DacPackage.Load(productionPackagePath, DacSchemaModelStorageType.Memory))
    {
        Console.WriteLine("Deploying the production dacpac to 'ProductionDB'");
        DacServices services =
            new DacServices("Server=(localdb)\\v11.0;Integrated Security=true;");
        services.Deploy(package, "ProductionDB");
    }
}
```

Solution 2: Filtering at deployment time

So filtering objects in a dacpac is one option, but what if you want to avoid the need to create a new dacpac? Isn’t there a way to just change things when you’re actually deploying the dacpac? That’s exactly what we’ll show you next by creating a custom Deployment Plan Modifier contributor that runs during the deployment pipeline. These are covered in a separate walkthrough but this example will show you how to specify the contributors to run at deployment time rather than when building a project.

As usual the full code for this example is in the samples. To see how this example works look at the SamplesTests\TestFiltering.cs unit test class. The “TestFilterPlanWhenPublishing” unit test runs this end to end. In this case a unit test was used since it avoided the need to install the sample to the extensions directory before running the sample code (see Best Practices for more information).

Creating a filtering deployment contributor class

A basic contributor class just requires an Export attribute and to extend the DeploymentPlanModifier class. Here’s a “Hello World” contributor and how to add it to the deployment:

```csharp
[ExportDeploymentPlanModifier(PlanFilterer.PlanFiltererContributorId, "1.0.0.0")]
public class PlanFilterer : DeploymentPlanModifier
{
    public const string PlanFiltererContributorId = "Public.Dac.Samples.PlanFilterer";

    protected override void OnExecute(DeploymentPlanContributorContext context)
    {
        base.PublishMessage(new ExtensibilityError("Hello world!", Severity.Message));
    }
}

public void DeployWithContributor()
{
    // Assume the dacpac exists
    DacServices services =
        new DacServices("Server=(localdb)\\v11.0;Integrated Security=true;");

    string productionDbName = "ProductionDB";
    using (DacPackage package =
        DacPackage.Load(existingPackagePath, DacSchemaModelStorageType.Memory))
    {
        // Deploy the dacpac with an additional "filter" contributor.
        DacDeployOptions options = new DacDeployOptions()
        {
            AdditionalDeploymentContributors = PlanFilterer.PlanFiltererContributorId
        };

        services.Deploy(
            package,
            productionDbName,
            upgradeExisting: true,
            options: options);
    }
}
```

Note that this doesn’t cover actual installation of the contributor DLL – that’s covered under the Best Practices section later in the document.

Reading and filtering the deployment plan steps

During deployment a number of different objects are available to a contributor. In this case the deployment plan is the most interesting. It describes each step in the deployment, and contributors can add new steps and remove or replace existing steps. For this example, what we need is to block any CreateElementStep that mentions the schemas to be filtered. Understanding which step you need to examine might not be immediately obvious – in this case you could probably guess, but sometimes the best approach is to write a dummy contributor that steps through a plan while you debug a deployment, or one that writes each step's type and contents to a file. That lets you understand the precise types to work with.

Here’s the code that actually filters out steps. We’re reusing the filter code from the first scenario since the logic is the same. The only difference is that each step has only one object, so we apply our filter and, if no objects are left afterwards, we know the step should be removed.

```csharp
private IFilter _filter;

protected override void OnExecute(DeploymentPlanContributorContext context)
{
    // Initialize filter options based on contributor arguments
    InitializeFilter(context.Arguments);

    DeploymentStep next = context.PlanHandle.Head;
    while (next != null)
    {
        DeploymentStep current = next;
        next = current.Next;

        CreateElementStep createStep = current as CreateElementStep;
        if (createStep != null && ShouldFilter(createStep))
        {
            base.Remove(context.PlanHandle, createStep);
        }
    }
}

private bool ShouldFilter(CreateElementStep createStep)
{
    TSqlObject createdObject = createStep.SourceElement;
    return !_filter.Filter(new[] { createdObject }).Any();
}
```

Notes:

  • We’ve skipped a number of steps here, most importantly how the filter is actually initialized. It’s fairly simple code, and if you debug through the example you’ll see exactly how it works.
  • While writing this example, we added an “Initialize” method to the IFilter interface. This doesn’t look right on an interface, so in a real-world example we’d probably change this to an abstract “Filter” class with an empty default implementation of the Initialize method, or simply use a factory pattern for creating the filters instead.

Configuring the contributor at deployment time, and running the deployment

Now that we’ve written the sample contributor, let’s see how it would be used during deployment:

```csharp
// Note: deploying to (localdb)\v11.0 here, which is the default LocalDB instance
// for SQL Server 2012. You may have a different instance on your machine;
// if you run into any problems then look online for LocalDB help
DacServices services =
    new DacServices("Server=(localdb)\\v11.0;Integrated Security=true;");

string productionDbName = "ProductionDB";
using (DacPackage package =
    DacPackage.Load(existingPackagePath, DacSchemaModelStorageType.Memory))
{
    // Deploy the dacpac with an additional "filter" contributor.
    DacDeployOptions options = new DacDeployOptions();
    options.AdditionalDeploymentContributors = PlanFilterer.PlanFiltererContributorId;

    // Specify the filter to use and what arguments it needs.
    // Note that this is a little limited by having to pass string-based arguments.
    // This could be worked around by serializing arguments to a file and passing
    // the file path to the contributor if you need to do anything advanced.
    options.AdditionalDeploymentContributorArguments =
        PlanFilterer.BuildPlanFiltererArgumentString(
            "SchemaBasedFilter",
            new Dictionary<string, string>()
            {
                {"Schema1", "dev"},
                {"Schema2", "test"},
            });

    // Run the deployment with the options as specified
    services.Deploy(package,
        productionDbName,
        upgradeExisting: true,
        options: options);
}
```

And that’s that! Now you have the ability to filter by schema when deploying a dacpac.

Building on this example

Follow up scenarios you could try for yourself:

| Scenario | Hint (how to do it) |
| --- | --- |
| Extract a dacpac from a database and filter out some objects, for example all Users and Logins, so that later you could replace them with new ones. | Use the DacServices API to extract the dacpac, then run the ModelFilterer on it with a new “FilterObjectType” filter. |
| Implement a more relaxed “Block on Table Loss” check instead of the current “Block on possible Data Loss”. This is another real-world example: a team wanted to allow columns to be dropped, but wanted to block the deployment if tables were removed. | Write a DeploymentPlanModifier contributor that looks at the ModelComparisonResult in the deployment context and checks whether any tables are in the list of elements to be dropped. If there are, block deployment by publishing an error message with severity “Error”. |
| Implement a more relaxed “Drop Objects not in source” option that doesn’t drop elements in a “reserved” schema. Another real-world example. | In addition to filtering CreateElementSteps, you would also filter DropElementSteps and AlterElementSteps if they relate to the reserved schema. Note that you will need a recent release of DacFx to make this work, as there was a bug in the previous version. |

Learnings and best practices

Testing deployment contributors

The walkthrough guides discuss how to install a contributor so that your Visual Studio projects can make use of them. That’s great in a way, but it’s really not what you want to use during testing. It’s too cumbersome to copy the DLL each time you run it, and if you actually open Visual Studio to test it, you’ll need to shut it down every time you want to change your contributor code.

The best way to test contributors is to write unit tests that reference your contributor DLL and the DacFx DLLs. To be picked up during deployment, the contributor code must be in a DLL file (not an executable), and that DLL must either be under the standard DacFx extensions directory on your machine or be in the same directory as the “Microsoft.Data.Tools.Schema.Sql.dll” file. If you are writing unit tests, the second option has one really powerful benefit: unit tests usually copy all referenced DLLs to the same location, so if your unit test references the DacFx DLLs and your contributor DLL, you can run tests without needing to copy the contributor code into the extensions directory. When the test is run, both will be in the same location and hence the DacFx extension manager will find your contributor.

The deployment plan filtering example uses this approach and it makes it really easy to make changes to the contributor and verify that everything works.
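Under those assumptions, a test can exercise a contributor just by generating a deploy script through DacFx. This is a sketch: the dacpac path, connection string, contributor id, and use of MSTest are all illustrative:

```csharp
using Microsoft.SqlServer.Dac;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ContributorTests
{
    [TestMethod]
    public void ContributorRunsDuringScriptGeneration()
    {
        // DacFx finds the contributor because the test project references its DLL,
        // so the build copies it next to Microsoft.Data.Tools.Schema.Sql.dll
        // in the test output directory.
        var options = new DacDeployOptions
        {
            // Hypothetical contributor id.
            AdditionalDeploymentContributors = "MyCompany.SkipReservedSchema"
        };

        using (DacPackage package = DacPackage.Load(@"TestData\Source.dacpac"))
        {
            var services = new DacServices(
                @"Data Source=(localdb)\ProjectsV13;Integrated Security=true"); // illustrative
            string script = services.GenerateDeployScript(package, "TargetDb", options);

            // Assert on the generated script, e.g. that reserved-schema drops were filtered out.
            Assert.IsFalse(script.Contains("DROP TABLE [reserved]."));
        }
    }
}
```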

Conclusion

Extending DacFx can help solve common issues that your team runs into. Extensions can be really powerful: the APIs are intended to let you do everything our tools can do internally. We’re not quite there yet, but we’ll be updating the current APIs and adding new ones in the future, so stay tuned!

Hopefully after reading this tutorial you’ll take a moment to think about an issue you’ve had that SSDT/DacFx doesn’t solve for you right now, and whether you could solve it yourself. If you’d like to share your solution with others, think about publishing it online or adding it to the samples project http://dacsamples.codeplex.com/.

SQL Server Data Tools Preview update for August 2015


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for August 2015 includes a new unified setup for both Database and Business Intelligence (BI) tools in Visual Studio 2015.

SQL Server Data Tools – One installer for Database and Business Intelligence tools

SQL Server Data Tools has been the brand name for both SQL Server Database and Business Intelligence tools shipping in Visual Studio since 2012. We are pleased to announce that starting with our SQL Server Data Tools Preview for VS2015, all these tools will be included in one installer. In the current release SQL Server Integration Services (SSIS) support has been added, with SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) coming in a future update. This dramatically simplifies how users install and update their SQL Server tools inside Visual Studio. We’re sure you will enjoy trying out the latest SQL Server features in Visual Studio and look forward to hearing your feedback!

New setup interface

For those of you used to updating SQL Server Data Tools you’ll notice there is a new configuration page allowing you to choose whether to install the Business Intelligence products:

SQL Server Database tooling is always installed since it is part of the core Visual Studio experience. For now the SSIS tools are the only additional option available, with SSAS and SSRS coming soon.

SQL Server version support across the tools

Database tools will continue to support SQL Server 2005 through to SQL Server 2016, including Azure SQL DB. The Business Intelligence projects continue to have a different support matrix. Please note that during the Preview period support for versions of SQL Server earlier than SQL Server 2016 may be limited. In the SQL Server 2016 RTM time-frame support for earlier SQL Server versions will be improved.

Visual Studio version support across the tools

The single installer for DB + BI tools is only available for Visual Studio 2015. Future Visual Studio versions will also benefit from this single installer, but do note that for Visual Studio 2013 you will need to install SSDT Database tools and SSDT-BI separately.

Get it here:

Download SSDT August Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview of SSDT in Visual Studio 2015 is 14.0.50901
  • The version number for the latest preview of SSDT in Visual Studio 2013 is 12.0.50901
  • The version number for the latest preview of DacFx is 13.0.3050.1
Known issues:
  • SSIS tools require Visual Studio 2015 to be previously installed due to a known issue in the Visual Studio Isolated Shell installer. This issue will be fixed as a part of Visual Studio Update 1.
  • SSIS tools may not support Windows 7, Windows Server 2008 R2, or lower versions of Windows.
Contact Us

If you have any questions or feedback, please visit our forum or Microsoft Connect page.  We are fully committed to improving the SSDT experience and look forward to hearing from you!

SQL Server Data Tools Preview update for September 2015


Today we announce the latest preview of SQL Server Data Tools (SSDT). The September preview update provides new feature support in SQL Server 2016 CTP2.4, Azure SQL Database and multiple languages in Visual Studio 2015.

Who can benefit from using the Preview?

  • SQL Server 2016 CTP2.4 database developers who would like to try the new CTP features.
  • Azure SQL Database developers with the latest GA and Preview features.
  • Database developers who would like to install the Chinese, French, Italian, Japanese, Korean, Portuguese, Russian, or Spanish version of SSDT in Visual Studio 2015.

Get it here:

Download SSDT September 2015 Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview in Visual Studio 2015 is 14.0.50927
  • The version number for the latest preview in Visual Studio 2013 is 12.0.50927

Download Data-Tier Application Framework September 2015 Preview

  • The version number for the latest preview of DacFx is 13.0.3086.1

What’s New in SSDT and DacFx?

  • New feature support in SQL Server 2016 CTP2.4
  • Multiple language support for SSDT in Visual Studio 2015

Contact Us

If you have any questions or feedback, please visit our forum or Microsoft Connect page.  We are fully committed to improving the SSDT experience and look forward to hearing from you!

SQL Server Data Tools Preview update for October 2015


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for October 2015 adds support for SSAS and SSRS projects in Visual Studio 2015. SSIS has addressed various issues reported in the August and September preview releases. This update also includes SQL database project features for SQL Server 2016 CTP3 and Azure SQL Database.

Important!

  • If you are trying out SSMS, SSDT, and a SQL Server 2016 CTP release on the same machine, please install or upgrade the SQL Server 2016 CTP version to CTP3.0 before updating SSDT or SSMS to avoid a known issue in SSMS.

Who can benefit from using the Preview?

  • Analysis Services and Reporting Services developers on Visual Studio 2015 using SQL Server 2008 to 2014, and developers who would like to try the new CTP features in SQL Server 2016. Find more detail about the Analysis Services and Reporting Services features in this update.
  • Integration Services developers on Visual Studio 2015 who are experiencing issues with an existing installation of the SSDT August or September 2015 preview update, or who would like to try SSIS projects on Windows 7.
  • SQL Server 2016 CTP3.0 database developers who would like to try the new CTP features.
  • Azure SQL Database developers who would like to try the latest GA and Preview features.

Get it here:

Download SSDT October 2015 Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview in Visual Studio 2015 is 14.0.51026
  • The version number for the latest preview in Visual Studio 2013 is 12.0.51026

Download Data-Tier Application Framework October 2015 Preview

  • The version number for the latest preview of DacFx is 13.0.3124.1

What’s New in SSDT and DacFx?

  • SSDT One-Installer support for SSAS and SSRS.
  • New SSIS features including a Hadoop connector, control flow templates, and a relaxed maximum buffer size for the Data Flow Task.
  • Various bug fixes in SSIS and support for Windows 7.
  • SQL Server 2016 CTP3 feature support for database project.

Known issues:

  • SSDT side by side installation with SQL Server 2016 CTP2.4 and SSMS September release.
    • A breaking change was introduced between SQL Server 2016 CTP2.4 and CTP3.0. This may cause SSMS to crash if the SSDT Preview October 2015 update is installed on the same machine where the SSMS September 2015 update and SQL Server 2016 CTP2.4 are already installed. A workaround is to install SQL Server 2016 CTP3 and the SSMS October 2015 update before the SSDT Preview October 2015 update.
  • SSIS side by side installation
    • SSIS developers may encounter an issue when the SSDT Preview October 2015 update is installed side by side with SQL Server 2016 CTP2.4 or an older version on the same machine.
    • Debugging a package with a Data Flow Task may result in an "Unable to save to XML" error. A workaround is to set the "Run64BitRuntime" SSIS project property to False.
  • SSIS Template issues
    • Connection managers bound to template instances may also be listed by other normal tasks or Foreach Loop containers. A workaround is to ignore the connection managers bound to template instances; using those connection managers in other executables may lead to unknown issues.
    • After renaming a template file, packages which contain the template reference cannot be updated automatically. A workaround is to manually edit the .dtsx.designer file to update the template references.
    • After replacing an executable in a template, the template instance in a package will be refreshed, but the connection manager name of the template instance is not prefixed with "(template)". A workaround is to remove the existing template instance and add a new one.
    • If an executable in a template is changed from a task to a container or vice versa, an existing package which contains the template cannot add any executable by double-clicking the toolbox. A workaround is to add the executable by drag and drop.
    • When designing a package with a template, layout issues may occur. A workaround is to save the package after adding a new template instance, before performing the next operation.

Contact Us

If you have any questions or feedback, please visit our forum or Microsoft Connect page.  We are fully committed to improving the SSDT experience and look forward to hearing from you!

SQL Server Data Tools Preview update for November 2015


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for November 2015 includes a new connection experience for Microsoft SQL Server and Azure SQL Database. Key improvements are:

  • Easily connect to any database from your history - from Publish, Schema Compare, Data Compare
  • Pin favorite connections for easy access
  • Browse your Azure SQL Databases direct from Visual Studio and simply click to connect
  • Azure firewall rule creation is automatically handled at connection time

Check out the new connection experience in action!

 

The SSDT Preview for November 2015 also includes various enhancements in SSAS and SSIS. Try it out today!

Get it here:

Download SSDT November 2015 Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview in Visual Studio 2015 is 14.0.51128.0
  • The version number for the latest preview in Visual Studio 2013 is 12.0.51127.0

Download Data-Tier Application Framework November 2015 Preview.

  •  The version number for the latest preview is 12.0.50914.0

What's new in SSDT?

  • Preview of improved connection experience for SQL Server and Azure SQL Database.
  • SSIS catalog performance improvement: the performance of most SSIS catalog views for non-ssis_admin users is improved.
  • SSAS enhancements, including Tabular model upgrade to the SQL Server 2016 compatibility level and adoption of a JSON editor for the BIM file. See what's new for SQL Server Analysis Services in this post

Known issues:

  • Connection dialog
    • If installing SSDT standalone - without any previous Visual Studio 2015 installation (Community, Pro+, or Express) - Azure login and browse functionality will not be available. To fix, install a version of Visual Studio and update the Azure Tools to version 2.7 or higher.
    • In the Connection Properties, changing the Authentication Type will result in the Database Name being reset.
    • In the History tab, long server or database names may result in the Pin to Favorites button being off-screen and requiring a scroll to the right.
  • SQL Server Object Explorer
    • For users using Active Directory Integrated or Active Directory Password connections who are not server administrators, browsing in SSOX does not work.
  • Setup
    • If a version of SQL Server 2016 CTP is already installed on the machine, setup may be blocked. This is to avoid breaking components shared by the engine.
    • If installing SSMS Preview, SSDT Preview for Visual Studio 2015, or SSDT Preview for Visual Studio 2013 side by side on the same machine, there may be issues unless all are updated to the latest release.
    • Creating an Administrative Install (by calling "SSDTSetup.exe /layout") for an English language installation will open a dialog to find the file "CommonAzureTools.CAB". This file is located in the payload folder created beside SSDTSetup.exe; choosing the file from there will allow the administrative install point to be created successfully.
    • (VS2015) .Net 4.6.1 may not install if there is no version of Visual Studio or the VS Integrated Shell previously installed on the machine. .Net 4.6 will be installed instead.
      • This will result in no Network connections being shown in the new connection dialog.
      • Recommendation is to install .Net 4.6.1 as it contains a number of stability improvements and SQL related fixes.
    • (VS2015) The April 2014 update to Windows 8.1 and Windows Server 2012 R2, known as KB2919355, is required to install the Visual Studio 2015 Isolated Shell. Please install KB2919355 before you install SSDT on these operating systems.
    • (VS2015) The SSDT Japanese language installer may fail on the first attempt due to an issue installing the Visual Studio 2015 Japanese language pack. Retrying installation should succeed in installing correctly.

Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!

 


SQL Server Data Tools Preview update for December 2015


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for December 2015 includes bug fixes and enhancements for the new connection experience for Microsoft SQL Server and Azure SQL Database which was introduced in the November 2015 update. This update also includes programmability support for SQL Server 2016 CTP3.2 features and enhancements in SQL Server Analysis Services.

What's new in SSDT?

Connection dialog bug fixes

  • Bug fixes for the recent history listing.
  • Resizing of the connection dialog.
  • Adjustment of the default connection dialog size for long database and server names.
  • Change of test connection timeout value to 15 seconds.
  • Proper use of authentication context set in the connection property when loading a database list. 
  • Allowing creation of an Azure SQL server firewall rule if the client IP is not registered when loading a database list.

SQL Server 2016 CTP3.2 feature programmability support

  • System-versioned temporal tables are now supported in database projects

SQL Server Analysis Services enhancements

  • Creation of calculated tables.
  • Setting the default for bi-directional cross filters.
  • Please visit the SSAS team blog to learn more.

SQL Server Integration Services enhancements

  • The SSIS Hadoop connector supports the Avro file format and Kerberos authentication.

Please note that SSIS designer support for SSIS 2012 and 2014 is not yet included in this update.

Get it here:
  • The version number for the latest preview in Visual Studio 2015 is 14.0.51215.0
  • The version number for the latest preview in Visual Studio 2013 is 12.0.51214.0
  • The version number for the latest preview is 13.0.900.80

Known issues:

Connection dialog

  • Property changes made through the Advanced Properties dialog may not be retained the next time the connection is opened through the History tab.
  • If installing SSDT standalone - without any previous Visual Studio 2015 installation (Community, Pro+, or Express) - Azure login and browse functionality will not be available. To fix, install a version of Visual Studio and update the Azure Tools to version 2.7 or higher.

Temporal Tables

  • Incremental deployment of an in-memory optimized table may fail if table migration occurs. The mitigation is to manually drop system versioning (the temporal link between the two tables) prior to table migration to avoid deployment failure.
  • When you create a new temporal table using the Project Item template and add columns to it, deployment may fail. The mitigation is not to add any columns until the table is deployed first. Once the table is created, you can add columns to the table and deploy it incrementally.
  • When you delete a column from a temporal table using the Table designer, adding a subsequent column may not work properly. To avoid this, keep the comma (,) after the last column in the table definition after deleting a column.

SQL Server Object Explorer

  • For users using Active Directory Integrated or Active Directory Password connections who are not server administrators, browsing in SSOX does not work.

Setup
  • If a version of SQL Server 2016 CTP is already installed on the machine, setup may be blocked. This is to avoid breaking components shared by the engine.
  • If installing SSMS Preview, SSDT Preview for Visual Studio 2015, or SSDT Preview for Visual Studio 2013 side by side on the same machine, there may be issues unless all are updated to the latest release.
  • (VS2015) .Net 4.6.1 may not install if there is no version of Visual Studio or the VS Integrated Shell previously installed on the machine. .Net 4.6 will be installed instead.
    • This will result in no Network connections being shown in the new connection dialog.
    • Recommendation is to install .Net 4.6.1 as it contains a number of stability improvements and SQL related fixes.
  • (VS2015) The April 2014 update to Windows 8.1 and Windows Server 2012 R2, known as KB2919355, is required to install the Visual Studio 2015 Isolated Shell. Please install KB2919355 before you install SSDT on these operating systems.
  • (VS2015) The SSDT Japanese language installer may fail on the first attempt due to an issue installing the Visual Studio 2015 Japanese language pack. Retrying installation should succeed in installing correctly.
Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!


SQL Server Data Tools Preview update for January 2016


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for January 2016 adds support for the latest updates in SQL Server 2016 CTP3.3, more feature enhancements in SSAS and SSIS, and various bug fixes.

*Please note that SSIS designer support for SSIS 2012 and 2014 is not yet included in this update.

What’s new in SSDT?

SQL Server 2016 CTP3.3 database programmability support

SQL Server Analysis Services enhancements

  • Support for calculated columns and row level security for DirectQuery mode.
  • Support for translations of models.
  • Execute TMSL scripts in the SSIS Analysis Services Execute DDL task.
  • Many bug fixes.
  • For more details on Analysis Services in CTP3.3, see this blog post.

SQL Server Integration Services enhancements
  • Support for ODBC source and destination component.
  • Support for CDC control task.
  • Support for the CDC source and splitter components.
  • Support for Microsoft Connector for SAP BW, Integration Services Feature Pack for Azure.

Get it here:

  • The version number for the latest preview in Visual Studio 2015 is 14.0.60203.0
  • The version number for the latest preview in Visual Studio 2013 is 12.0.60202.0
  • The version number for the latest preview is 13.0.3213.1

Known issues:

Connection dialog

  • Property changes made through the Advanced Properties dialog may not be retained the next time the connection is opened through the History tab.
  • If installing SSDT standalone – without any previous Visual Studio 2015 installation (Community, Pro+, or Express) – Azure login and browse functionality will not be available. To fix, install a version of Visual Studio and update the Azure Tools to version 2.7 or higher.

Always Encrypted support

  • This feature support is not yet enabled in SSDT.

Temporal Tables

  • Incremental deployment of an in-memory optimized table may fail if table migration occurs. The mitigation is to manually drop system versioning (the temporal link between the two tables) prior to table migration to avoid deployment failure.
  • When you create a new temporal table using the Project Item template and add columns to it, deployment may fail. The mitigation is not to add any columns until the table is deployed first. Once the table is created, you can add columns to the table and deploy it incrementally.
  • When you delete a column from a temporal table using the Table designer, adding a subsequent column may not work properly. To avoid this, keep the comma (,) after the last column in the table definition after deleting a column.

Setup

  • If a version of SQL Server 2016 CTP is already installed on the machine, setup may be blocked. This is to avoid breaking components shared by the engine.
  • If installing SSMS Preview, SSDT Preview for Visual Studio 2015, or SSDT Preview for Visual Studio 2013 side by side on the same machine, there may be issues unless all are updated to the latest release.
  • (VS2015) .Net 4.6.1 may not install if there is no version of Visual Studio or the VS Integrated Shell previously installed on the machine. .Net 4.6 will be installed instead.
    • This will result in no Network connections being shown in the new connection dialog.
    • Recommendation is to install .Net 4.6.1 as it contains a number of stability improvements and SQL related fixes.
  • (VS2015) The April 2014 update to Windows 8.1 and Windows Server 2012 R2, known as KB2919355, is required to install the Visual Studio 2015 Isolated Shell. Please install KB2919355 before you install SSDT on these operating systems.
  • (VS2015) The SSDT Japanese language installer may fail on the first attempt due to an issue installing the Visual Studio 2015 Japanese language pack. Retrying installation should succeed in installing correctly.
Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!

 

SQL Server Data Tools Preview update for Feb 2016


The SQL Server Data Tools team is pleased to announce an update for SSDT Preview is now available. The SSDT Preview update for February 2016 enables SSIS one-designer support for SQL Server 2012 and SQL Server 2014 in Visual Studio 2015. This allows you to develop SQL database, AS, RS, and IS projects for multiple SQL Server versions with one installation of SSDT in Visual Studio 2015.

What’s new in SSDT

SQL Server Integration Services enhancement

  • One-designer support for multi-targeting SQL Server 2016 RC0, SQL Server 2014, and SQL Server 2012!
  • SSIS Hadoop connector support for the ORC format
  • For more information see the Integration Services blog.

SQL Server Analysis Services enhancement

  • Support for display folders for Tabular models
  • Models created with the new SQL Server 2016 compatibility level can be used with SSIS
  • For more information see the Analysis Services blog.

SQL Server 2016 RC0 database programmability support

Get it here:
Download SSDT February 2016 Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview in Visual Studio 2015 is 14.0.60316.0
  • The version number for the latest preview in Visual Studio 2013 is 12.0.60315.0

Update 3/18/2016: We have released an updated version of the SSDT February 2016 Preview to correct an issue reported by users that affected the ability to connect to Azure SQL Data Warehouse.  The previous versions, 14.0.60305.0 and 12.0.60304.0, have been replaced by the fixed version at the Download page.

Download Data-Tier Application Framework February 2016 Preview

  • The version number for the latest preview is 13.0.3252.1

Known issues:

SQL Server Integration Services

  • The SQL Server 2016 project connection manager UI is used instead of the 2014 UI if you create a project-level connection manager in 2014, switch the target server version to SQL Server 2016, and then switch it back to SQL Server 2014.
  • The Script component doesn’t work after switching the target server version if the script fails to build.
  • Debugging in 64-bit mode in SSDT for Visual Studio 2015 doesn’t work if the target server version is set to SQL Server 2012/2014.
  • The script editor in Visual Studio 2013 may crash when SSDT-BI for Visual Studio 2013 and SSDT for Visual Studio 2015 are installed on the same machine.

Connection dialog

  • Property changes made through the Advanced Properties dialog may not be retained the next time the connection is opened through the History tab.
  • If installing SSDT standalone – without any previous Visual Studio 2015 installation (Community, Pro+, or Express) – Azure login and browse functionality will not be available. To fix, install a version of Visual Studio and update the Azure Tools to version 2.7 or higher.

Temporal Tables

  • Incremental deployment of an in-memory optimized table may fail if table migration occurs. The mitigation is to manually drop system versioning (the temporal link between the two tables) prior to table migration to avoid deployment failure.
  • When you create a new temporal table using the Project Item template and add columns to it, deployment may fail. The mitigation is not to add any columns until the table is deployed first. Once the table is created, you can add columns to the table and deploy it incrementally.
  • When you delete a column from a temporal table using the Table designer, adding a subsequent column may not work properly. To avoid this, keep the comma (,) after the last column in the table definition after deleting a column.

Setup

  • If a version of SQL Server 2016 CTP is already installed on the machine, setup may be blocked. This is to avoid breaking components shared by the engine.
  • If installing SSMS Preview, SSDT Preview for Visual Studio 2015, or SSDT Preview for Visual Studio 2013 side by side on the same machine, there may be issues unless all are updated to the latest release.
  • (VS2015) .Net 4.6.1 may not install if there is no version of Visual Studio or the VS Integrated Shell previously installed on the machine. .Net 4.6 will be installed instead.
    • This will result in no Network connections being shown in the new connection dialog.
    • Recommendation is to install .Net 4.6.1 as it contains a number of stability improvements and SQL related fixes.
  • (VS2015) The April 2014 update to Windows 8.1 and Windows Server 2012 R2, known as KB2919355, is required to install the Visual Studio 2015 Isolated Shell. Please install KB2919355 before you install SSDT on these operating systems.
  • (VS2015) The SSDT Japanese language installer may fail on the first attempt due to an issue installing the Visual Studio 2015 Japanese language pack. Retrying installation should succeed in installing correctly.

Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!

Special update: SSDT Preview Update with SQL Server 2016 RC2 support


This month we are not releasing the standard SSDT monthly refresh on our main download page. In this post we’ll cover why we’re doing this and how to get the latest SQL Server 2016 RC2 compatible SSDT download. If you are an SSRS user, we recommend you read the whole post; anyone else can skip to the bottom for download links.

Who Should download this SSDT Update
SSIS users who target SQL Server 2016, and anyone looking to try new SSAS or SSRS features added in RC1 or RC2, should download this update. For anyone else, we recommend waiting 2-3 weeks for our next update, which resolves the known issue described below. Before downloading and installing this update, please read the following issue description.

Known issue with this update
Report deployment can fail when targeting SQL Server 2014 or earlier

When you build or deploy a Reporting Services report project whose TargetServerVersion is set to “SQL Server 2008” or “SQL Server 2008 R2, 2012, or 2014,” you might encounter the following error: “Attribute MustUnderstand was detected in the report. SQL Server 2014 Reporting Services and earlier do not support MustUnderstand. Either reduce the error level to a value less than 2 or change the TargetServerVersion to a version that supports MustUnderstand.”

The issue is that the MustUnderstand attribute, currently applied only to the DefaultFontFamily element, is incorrectly treated as an item that needs to be downgraded rather than removed. This issue will be fixed in the next update of SSDT.

Workaround if you want to use SSRS against SQL Server 2014 or earlier:
Set the project’s ErrorLevel to 1 or 0 in the project properties to change the error to a warning. Note that a side effect of lowering the error level is the possible automatic deletion of elements incompatible with the TargetServerVersion. For details about error levels and project properties, see Set deployment properties.

Download SSDTSetup.exe for Visual Studio 2015

Download DAC Framework

Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment


This blog is part 4 of Automate Build and Deployment of Azure SQL Database with Continuous Integration and Continuous Deployment.

Content of Tutorial:

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation
Part 2. Automate Building Azure SQL Database with Continuous Integration
Part 3. Create Nuget package for Azure SQL Database
Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment (this page)
Part 5. Use your own Build and Deployment Agent (coming soon)

In part 4, we will play through how to automate the release workflow for Azure SQL Database using continuous deployment. To run this part successfully, you must complete part 2 first.

In this scenario, I will use the following simplified release workflow.

  1. When a new build is successfully created, the database is automatically deployed to the User Acceptance Test (UAT) server so that user acceptance testing can be performed.
  2. Production server deployment is pending while UAT is in progress.
  3. When UAT passes and an approver signs off on the deployment to PROD, the deployment to the PROD environment is executed.

In part 4, we will play through

  • How to configure release environment and deployment workflow.
  • How to configure deployment task.
  • How to configure release trigger with continuous deployment.

 


 

Go to the RELEASE page on your team project web portal and add a new Release Definition by clicking +

menu

Use Empty deployment template

deployment template

Provide a name for the definition.

Rename Default environment to UAT which represents User Acceptance Test environment in our scenario.

Click + Add tasks and add Azure SQL Database Deployment.

add task

First, configure the Deploy Azure SQL DACPAC task. It will start with the following page.

task config

Click [Manage] for the Azure Classic Subscription property. It will open the Services control panel.

Click New Service Endpoint >> Azure Classic as shown below.

add service

It will open the ADD NEW AZURE CLASSIC CONNECTION page shown below.

Select Certificate Based and provide a name for the connection, e.g. MyConnection.

Get the publish settings file by clicking the download link at the bottom of the dialog and open the file. Copy and paste the corresponding string values (Id, Name, and ManagementCertificate) from the file into the connection configuration dialog.

Then click OK to create the new connection.

<?xml version="1.0" encoding="utf-8"?>
<PublishData>
  …
  <Subscription
    …
    Id="yoursubscriptionid"
    Name="yoursubscriptionname"
    ManagementCertificate="yourmanagementcertificate" />
  …
</PublishData>

add connection

Go back to the Release Definition page and select the configured connection from the drop-down.

connection dropdown

Click … on the DACPAC File property.

Link it with your build definition, then select a dacpac file.

link source

select  file

Set the Target properties by providing your Azure SQL server name and the database name for the UAT environment. Let’s name the database MyDatabaseUAT; the PROD database name will be MyDatabase.

  • Provide the Server Admin Login name.
  • For Password, use the variable $(password) to secure the password. You don’t want to show your password in clear text here.
  • Set Firewall to AutoDetect.

set target
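Under the covers, the Azure SQL Database Deployment task drives SqlPackage.exe. A rough sketch of the equivalent command line follows; the server name, login, and file path are placeholders, and the password here comes from an environment variable rather than the locked $(password) release variable:

```shell
REM Publish the dacpac produced by the build to the UAT database.
REM Server, login, and dacpac path below are placeholders.
SqlPackage.exe /Action:Publish ^
  /SourceFile:"MyDatabase.dacpac" ^
  /TargetServerName:"yourserver.database.windows.net" ^
  /TargetDatabaseName:"MyDatabaseUAT" ^
  /TargetUser:"serveradmin" ^
  /TargetPassword:"%DB_PASSWORD%"
```

Note that the release task also creates the Azure firewall rule for you when Firewall is set to AutoDetect, which a raw SqlPackage.exe call does not do.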

Go to the Configuration section and add a variable ‘password’ with its value. Don’t forget to lock it.

add variable

Save. Next, let’s add the PROD environment.

Clone UAT to create the PROD environment.

clone env

Change the Database Name of the Azure SQL Database Deployment task to MyDatabase in the PROD environment.

target for prod

The next step is to configure continuous deployment on this release definition and set up the release workflow.

Go to Triggers page and select Continuous Deployment.

Select your database build definition for the [Set trigger on artifact source to] property. When a build completes successfully, the build process will create a new release workflow item for the UAT and PROD environments.

trigger

For release workflow and orchestration, let’s configure the following:

  • Start deploying the database to UAT automatically whenever a new version of the database is successfully built.
  • Keep the PROD deployment in a pending state until the UAT deployment is successful AND all authorized personnel sign off on the deployment to the PROD environment.

Click the Edit icon for each environment and configure UAT.

Configure the UAT Trigger to [After release creation]

Trigger config

Configure the PROD Trigger to [After successful deployment on another environment] and set the Triggering environment to [UAT]

trigger config for prod

On the PROD Configuration page, go to the Approvals page.

Add a Pre-deployment approver and enable email notification.

approver

Save all changes.

Now we can test the end-to-end flow.

Go to Build page and select your build definition. Queue Build…

Upon a successful build, Team Services creates a new release for the UAT and PROD environments as we defined.

The UAT deployment is performed automatically, and the PROD deployment is pending approval.

Approve the PROD deployment. It will continue the release workflow and finish the deployment to PROD.

release approve

You have completed automating the deployment of Azure SQL Database, and you now have a fully working environment to develop, build, and deploy Azure SQL Database with CI and CD.

In the next part, we will go over a more advanced topic where you bring your own build and deployment agent instead of using the hosted agent. This gives you more control over the build agent’s resources, build tool versions (such as DACFx and SqlPackage.exe), the target environment, and so on.

 

Part 3. Create Nuget package for Azure SQL Database


This blog is part 3 of Automate Build and Deployment of Azure SQL Database with Continuous Integration and Continuous Deployment.

Content of Tutorial:

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation
Part 2. Automate Building Azure SQL Database with Continuous Integration
Part 3. Create Nuget package for Azure SQL Database (this page)
Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment
Part 5. Use your own Build and Deployment Agent (coming soon)

Using Visual Studio Team Services, you can easily produce a nuget package for your database whenever a new build version is created.

In part 3, we will play through

  • How to set up your own internal nuget feed for Azure SQL Database.
  • How to configure the nuget packager and publisher tasks for automation.
  • How to nuget-install the dacpac for your Azure SQL Database.

 


 

Let’s start by creating your own internal nuget feed. It can be done with one click.

Go to Marketplace >> Browse Marketplace and install Package Management, unless you have installed it already.

market

A PACKAGE* section should be added to your web portal. Go to PACKAGE*

package section

Below is a sample screenshot of what you will get after running through this walkthrough.

nuget package for sql database

First, click + New feed to add a feed definition. Name your feed and click Create. It will create a feed in your team project.

create feed

It’s really that simple.

Let’s configure the automation tasks to produce and publish a nuget package for Azure SQL Database.

Copy the Nuget package source URL to the clipboard. We will need this URL shortly.

get bundle

Get the Nuget Credential Provider bundle.

Execute the Nuget package source addition command in cmd.exe as instructed in the dialog.

add source feed

 

This enables your PC to access the nuget feed.
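The command shown in the dialog is a nuget.exe sources command along these lines; the feed name and URL below are placeholders, so use the exact URL you copied from your feed page:

```shell
REM Register the internal Team Services feed as a nuget package source.
REM "MyFeed" and the URL are placeholders - paste your own feed URL.
nuget.exe sources Add -Name "MyFeed" ^
  -Source "https://youraccount.pkgs.visualstudio.com/_packaging/MyFeed/nuget/v3/index.json"
```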

Let’s configure Nuget Packager and Nuget Publisher tasks.

Go to your build definition on the Build page of the web portal.

Add two tasks: Nuget Packager and Nuget Publisher

build definition

First, configure Nuget Packager. Two properties are mandatory.

  • Use Build number to version package: Enable
  • Package Folder: Browse and Select your database project folder

nuget packager

Next, configure the Nuget Publisher.

  • Feed type: Internal Nuget Feed
  • Internal Feed URL: paste the feed URL from the clipboard

nuget publisher

Go to the General page of your build definition and set the Build number format to the following

$(date:yyyy.MM.dd)$(rev:.r)

The build number format is important; otherwise the Nuget Packager task will return an execution error.

build number

Save the build definition.

Go to your database project in Visual Studio. Create a new empty file, rename it with a .nuspec extension, and add the file to your database project.

Open the nuspec file and insert the following. You can change the string values as you like. Make sure the <file src> path is correct.

<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
  <metadata>
    <id>MyDatabase</id>
    <version>1.0.0</version>
    <title>MyDatabase dacpac nuget package</title>
    <authors>sqldatatools</authors>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Nuget package demo for MyDatabase</description>
    <copyright>@your copyright statement</copyright>
  </metadata>
  <files>
    <file src="..\MyDatabase\bin\Debug\MyDatabase.dacpac" target="content\MyDatabase.dacpac" />
  </files>
</package>
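For reference, the Nuget Packager task effectively runs nuget.exe pack against this nuspec, substituting the build number for the version element. A local sketch, where the version value is just an example in the build number format configured earlier:

```shell
REM Pack the nuspec into a .nupkg, overriding <version> with the build number.
nuget.exe pack MyDatabase.nuspec -Version 2016.04.13.1
```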

Commit and Sync the change. It will trigger a new build that includes the Nuget Packager and Publisher tasks.

nuget in action

 

Go to the Team Project web portal and the PACKAGE* section. You should be able to see that your first nuget package has been created in the internal feed.

nuget result

 

One use for this scenario is sharing your database and its versions with your project team using nuget. For each package, the feed page shows both the Package Manager Console command and the Windows command, as shown above. Copy and execute one to test.

Copy your nuget install command from your feed page and execute it in a command window. It will install MyDatabase.dacpac from the nuget package, as shown below.

nuget install result
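The install command copied from the feed page looks roughly like the following; the package id, feed URL, and output folder are placeholders:

```shell
REM Download MyDatabase.dacpac from the internal feed into .\packages.
REM Package id, feed URL, and output folder are placeholders.
nuget.exe install MyDatabase ^
  -Source "https://youraccount.pkgs.visualstudio.com/_packaging/MyFeed/nuget/v3/index.json" ^
  -OutputDirectory packages
```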

 

You have completed automating the nuget packaging of your database.

Go to Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment to learn how to orchestrate the release workflow using the build artifacts you have produced.

 

 


Part 2. Automate Building Azure SQL Database with Continuous Integration


This blog is part 2 of Automate Build and Deployment of Azure SQL Database with Continuous Integration and Continuous Deployment.

Content of Tutorial:

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation
Part 2. Automate Building Azure SQL Database with Continuous Integration (this page)
Part 3. Create Nuget package for Azure SQL Database
Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment
Part 5. Use your own Build and Deployment Agent (coming soon)

In part 2, we will play through

  • How to configure a build definition for your database project.
  • How to enable continuous integration so that a new build is triggered when source code changes.

 


 

Let’s start by creating a new build definition.

Click Team Explorer >> Builds in Visual Studio.

Add a New Build Definition under the Build Definitions section.

te-build2

It will open the Create new build definition dialog on your team project web portal.

Select Empty and click Next.

build definition dialog

On the Create new build definition page, you can simply use the default settings. We will look at the Agent Queue in more detail in part 5. Just use Hosted for now.

Click Create button.

create new build definition

After the build definition is created, click Add build step… on the build definition page.

Add two tasks: MSBuild and Copy & Publish Build Artifact.

msbuild add copy and publish add

First, configure the MSBuild task with the following properties. The other settings are optional.

  • Project: browse and select the *.sln or *.sqlproj file for your database project
  • MSBuild Arguments: insert the following argument

/t:build /p:CmdLineInMemoryStorage=True

build task config
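You can reproduce what the hosted agent’s MSBuild task does on your own machine. A sketch, assuming the default Debug configuration and that msbuild is on your PATH:

```shell
REM Build the database project; the dacpac lands in bin\Debug by default.
msbuild MyDatabase.sqlproj /t:Build /p:Configuration=Debug /p:CmdLineInMemoryStorage=True
```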

Next, configure the Copy and Publish Build Artifacts task.

  • Copy Root: Browse and select your database project folder, then append /bin/Debug at the end. If you changed the build output path in your sqlproj file, use that path.
  • Contents: *.dacpac
  • Artifact Name: MyDatabase
  • Artifact Type: Server

copy publish config

Save the build definition.

Let’s test the build definition. Queue a new build as shown below.

queue build

It will start a new build, produce build artifacts, and save them for each build.

build in action

You can Download or Explore the build artifacts on the Artifacts page after the build is complete.

build success

The final step is to configure the condition that triggers a build.

Edit your build definition and click Triggers section.

Enable Continuous Integration (CI). When you check in a source file change to the team project, a new build will be triggered.

You can also schedule builds as shown below.

build trigger

Go to Visual Studio, make a change to your database project, and check in the change.

It will trigger a new build.

You have completed setting up build automation for your Azure SQL Database.

Using this build automation, you can package the dacpac as a nuget package or use the build artifacts to orchestrate a release workflow with continuous deployment.

To learn about nuget package, see Part 3. Create Nuget package for Azure SQL Database

To learn about release orchestration and automation, see Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment

 

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation


This blog is part 1 of Automate Build and Deployment of Azure SQL Database with Continuous Integration and Continuous Deployment.

Content of Tutorial:

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation (this page)
Part 2. Automate Building Azure SQL Database with Continuous Integration
Part 3. Create Nuget package for Azure SQL Database
Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment
Part 5. Use your own Build and Deployment Agent (coming soon)

If you already have Visual Studio 2015, SSDT, and a Team Project in your Azure subscription, you can go straight to Part 2.

 


Let’s start.

Download and install Visual Studio 2015 with the latest update.

Download and install SQL Server Data Tools Preview with the latest update.

Create a Team Project in Azure if you don’t have one yet.

  • Login to https://portal.azure.com
  • Click + NEW >> Developer Services >> Team Project to create a new team project.
  • Create a Team Services account if you don’t have one.
  • Select Git for Version Control.
  • Enable Pin to dashboard checkbox for easy access.
  • Click Create button.

team project

Once the provisioning is complete, go to your team project on the portal and click Open in Visual Studio.

open in vs

It will start Visual Studio and connect it to your team project.

Team Explorer will ask you to install Git tools (if not installed already) and clone the repository. Simply follow the instructions in the Visual Studio UI.

clone repo

Let’s create a database project that will become your Azure SQL Database.

Create a new VS solution with a SQL Server Database Project by clicking New… under the Solutions section in Team Explorer and selecting SQL Server in the templates list.

Double-click to open the solution.

create new solution1

Go to Solution Explorer >> Project settings.

Change the Target platform to Microsoft Azure SQL Database V12 on the database project settings page.

Save the change.

change project target

Click Team Explorer >> Changes.

Click Commit and Sync to check in the database project to both the local Git repository and your team project.

commit and sync

Open the team project web portal by clicking Team Explorer >> Home >> Project | Web Portal.

This is your main workplace for setting up build and deployment automation.

web portal

Go to Code on the main menu. Your database project should be checked in, as shown below.

You have completed the one-time software and account setup and created an empty database project that we can use in this tutorial.

source control

Go to Part 2 to learn how to enable build automation for your database.

Part 2. Automate Building Azure SQL Database with Continuous Integration

Automate Build and Deployment of Azure SQL Database with Continuous Integration and Continuous Deployment


One of the key benefits of developing databases with SSDT is that you can easily integrate Application Lifecycle Management (ALM) practices into database development. You can develop and manage your database in source control such as Git, automate builds with continuous integration, and orchestrate releases with continuous deployment.

In this blog, we will walk through a series of end-to-end scenarios for continuous integration and deployment. After playing through these scenarios, you will have a complete build and deployment automation setup for your Azure SQL Database, from which you can venture into more advanced techniques. For simplicity, we will use Visual Studio Team Services and Azure so that you can enable the whole practice in a few minutes without writing any complex script or code.

Part 1. Prerequisite for Azure SQL Database Build and Deployment Automation
Part 2. Automate Building Azure SQL Database with Continuous Integration
Part 3. Create Nuget package for Azure SQL Database
Part 4. Orchestrate Azure SQL Database Release with Continuous Deployment
Part 5. Use your own Build and Deployment Agent (coming soon)

Things you will be able to do after playing through the tutorials in this blog:

You can develop a database in Visual Studio Team Services and Git
Git

 

You can build a database with continuous integration
build automation result

 

You can nuget-package a database with build automation
nuget package for sql database

 

You can deploy a database with continuous deployment and orchestrate release workflow
deploy automation

 

Additional Resources

If you want all the benefits of CI & CD but already use an alternative to Visual Studio Team Services, you may want to read Kent Chenery’s (@kentchenery) blog about enabling CI & CD for SQL Server databases using the DACFx API, JetBrains TeamCity, and Octopus Deploy. Go check it out here. It covers many of the same processes using alternative build and deployment services.

SQL Server Data Tools Preview update for April 2016


The SQL Server Data Tools team is pleased to announce that an update for the SSDT Preview is now available. The SSDT Preview update for April 2016 adds support for the latest updates in SQL Server 2016 RC3 and includes various bug fixes.

 

Get it here:

Download SSDT April 2016 Preview for Visual Studio 2015 and Visual Studio 2013

  • The version number for the latest preview in Visual Studio 2015 is 14.0.60413.0
  • The version number for the latest preview in Visual Studio 2013 is 12.0.60413.0

Download Data-Tier Application Framework April 2016 Preview

  • The version number for the latest preview is 13.0.3293.1

 

What’s new in SSDT?

SQL Server Database

  • Always Encrypted support: For databases that contain Always Encrypted columns, SSDT and DacFx allow viewing and editing these databases and publishing to them from a database project. Note that support for altering columns with column encryption present will come in a future release.
  • Connection dialog and SQL Server Object Explorer: Multiple fixes and improvements.
    • The Details page listing advanced connection properties was overhauled to show the full connection string in a multi-line box, and to improve support on High DPI machines.
    • We have brought back the traditional error dialog with detailed connection errors. This helps when diagnosing login issues with clearer error messages and a stack trace so that DBAs or CSS can get the information they need to help diagnose your problems.
    • For users with minimal permissions we fixed a number of issues around listing databases in the Connection Dialog and SQL Server Object Explorer, viewing the Security folder, and more.
    • Azure SQL DB performance when expanding the databases node to list all DBs has been improved.
  • SSDT installer:
    • Fixed issue where .Net was being downloaded on uninstall.
    • The installer size is now set correctly on High DPI machines.
    • Removed the version check blocking SSDT installation if a newer SQL Server version is present.
  • Schema Compare: Fixed a performance issue where checking/unchecking multiple items took a long time in Visual Studio.
  • Project System: Support for using LocalDB 2014 as the default debug target on x86 machines, since there is no x86 version of SQL Server 2016.
  • Build and Deployment:
    • Fixed issue where computed columns were not supported on Temporal Tables.
    • The “Execute deployment script in single-user mode” option is ignored when deploying to Azure V12 as this is not supported in cloud scenarios.
  • SqlPackage.exe: Fixed a problem whereby publishing a dacpac using a publish profile did not honor the DoNotDrop and Exclude options selected in the publish profile.
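To illustrate the publish-profile scenario from the SqlPackage.exe fix above, here is a sketch of publishing a dacpac with a profile (the file names are placeholders); the DoNotDrop and Exclude options saved in the profile are now honored:

```shell
REM Publish a dacpac using the options captured in a publish profile.
REM File names are placeholders for your own dacpac and profile.
SqlPackage.exe /Action:Publish ^
  /SourceFile:"MyDatabase.dacpac" ^
  /Profile:"MyDatabase.publish.xml"
```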

 

Known issues:

SQL Server Analysis Services (The following issues will be fixed in the next release.)

  • (Bug 7347254) Removing the DAX expression for a TablePermission does not remove the TablePermission object from the role.
  • (Bug 7342441) The AS model.bim designer is unable to re-open if a partial/incomplete DAX expression was saved in the model.
  • (Bug 7357060) VS crashes when right-clicking on table tabs after renaming a connection friendly name in a 1200 model.
  • (Bug 7359207) Multi-Dimensional projects cannot be built from the command line. The workaround is to roll back to Dev14 Update 1 or build from within the VS IDE.
  • (Bug 7368553) The Roles window hangs and disappears when there is a large number of roles in an AS Tabular project.
  • (Bug 7373792) Tabular AS projects hit a null reference exception when trying to deploy/process on a high-DPI (200%) machine.

Connection dialog

  • Properties changed through the Advanced Properties dialog may not be retained the next time the connection is opened through the History tab.
  • If installing SSDT standalone – without any previous Visual Studio 2015 installation (Community, Pro+, or Express) – Azure login and browse functionality will not be available. To fix, install a version of Visual Studio and update the Azure Tools to version 2.7 or higher.

Temporal Tables

  • When you create a new temporal table using the Project Item template and add columns to it, deployment may fail. The mitigation is to not add any columns until the table is deployed first. Once the table is created, you can add columns to the table and deploy it incrementally.
  • When you delete a column from a temporal table using the Table designer, adding a subsequent column may not work properly. To avoid this, keep the comma (,) after the last column in the table definition after deleting a column.

Setup

  • (VS2015) .Net 4.6.1 may not install if there is no version of Visual Studio or the VS Integrated Shell previously installed on the machine. .Net 4.6 will be installed instead.
    • This will result in no Network connections being shown in the new connection dialog.
    • Recommendation is to install .Net 4.6.1 as it contains a number of stability improvements and SQL related fixes.
  • (VS2015) The April 2014 update to Windows 8.1 and Windows Server 2012 R2, known as KB2919355, is required to install the Visual Studio 2015 Isolated Shell. Please install KB2919355 before you install SSDT on these operating systems.
  • (VS2015) The SSDT Japanese language installer may fail on the first attempt due to an issue installing the Visual Studio 2015 Japanese language pack. Retrying installation should succeed in installing correctly.

 

Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!

SQL Server Data Tools GA update for June 2016


The SQL Server Data Tools team is pleased to announce that the SSDT General Availability (GA) update is now released. The SSDT GA update for June 2016 adds support for SQL Server 2016 RTM and includes various bug fixes.

 

Get it here:

Download SSDT GA June 2016 for Visual Studio 2015 and Visual Studio 2013

Download Data-Tier Application Framework June 2016

  • The version number for GA is 13.0.3314.1

 

What’s new in SSDT?

One Installer for SQL Server database and Business Intelligence (BI) tools

  • New unified setup for both Database and Business Intelligence (BI) tools in Visual Studio 2015.
  • A simple way to acquire an integrated SQL Server database, Analysis Services, Reporting Services, and Integration Services designer and developer experience.

Support for multiple SQL Server versions

Monthly SSDT update releases for faster response to customers

  • New and unified SSDT download page.
  • Shipping every month on the web.
  • Faster response to customer feedback every month.
  • Light-up support for new Azure SQL Database features.
  • Automatic update notification.

 

SQL Server Analysis Service Enhancements

Project templates added for Tabular 1200 models in SSDT

  • No longer need two versions of SSDT for building relational and BI projects.
  • Adds project templates for Analysis Services solutions, including Analysis Services Tabular Projects used for building models at the 1200 compatibility level.
  • Includes other Analysis Services project templates for multidimensional and data mining solutions.

Improved DAX formula editing

  • Updates to the formula bar help you write formulas more easily by differentiating functions, fields, and measures using syntax coloring.
  • Provides intelligent function and field suggestions and tells you if parts of your DAX expression are wrong using error ‘squiggles’.
  • Allows you to use multiple lines (Alt + Enter) and indentation (Tab).

Improved SSDT modeling performance for Tabular 1200 models

  • For Tabular 1200 models, metadata operations in SSDT are much faster. By comparison, on the same hardware, creating a relationship on a model set to the SQL Server 2014 compatibility level (1103) with 23 tables takes 3 seconds, whereas the same relationship on a model created set to compatibility level 1200 takes just under a second.

DAX formula fixup

  • With formula fixup on a Tabular 1200 model, SSDT will automatically update any measures that reference a column or table that was renamed.

Support for Visual Studio configuration manager

  • To support multiple environments, like Test and Pre-production environments, Visual Studio allows developers to create multiple project configurations using the configuration manager. Multidimensional models already leveraged this, but Tabular models did not. With this release, you can now use the configuration manager to deploy to different servers.

Set default for bi-directional cross filters in tabular models in SSDT

  • This release enables bi-directional cross filters by default for tabular models at the 1200 compatibility level in SSDT. Filters are only auto-generated when the direction can be established with a high degree of certainty. If there is ambiguity in the form of multiple query paths across table relationships, a filter won’t be created automatically. See Bi-directional cross filters for tabular models in SQL Server 2016 Analysis Services for details.

Calculated tables in SSDT

  • A calculated table is a model-only construction based on a DAX expression or query in SSDT. When deployed in a database, a calculated table is indistinguishable from regular tables.
  • There are several uses for calculated tables, including the creation of new tables to expose an existing table in a specific role. The classic example is a Date table that operates in multiple contexts (order date, ship date, and so forth). By creating a calculated table for a given role, you can now activate a table relationship to facilitate queries or data interaction using the calculated table. Another use for calculated tables is to combine parts of existing tables into an entirely new table that exists only in the model. See Create a Calculated Table (SSAS Tabular) to learn more.

Translations in SSDT

  • You can now store translated metadata in a Tabular 1200 model. Metadata in the model includes fields for Culture, translated captions, and translated descriptions. To add translations, use the Model > Translations command in SQL Server Data Tools. See Translations in Tabular models (Analysis Services) for details.

Tabular Model Scripting Language (TMSL) supported in SSDT

  • Scripts can easily be generated in SSDT for tabular models at compatibility level 1200. Functionally, TMSL is equivalent to the XMLA ASSL extension that provides multidimensional object definitions, except that TMSL uses native descriptors like model, table, and relationship to describe tabular metadata. See Tabular Model Scripting Language (TMSL) Reference for details about the schema.

 

SQL Server Database Project Enhancements

Comprehensive programmability support for all new features of SQL Server 2016

  • Easy to develop, build and deploy SQL Server 2016 database.
  • Security: Always Encrypted, Row-Level security, Dynamic Data Masking, Transparent Data Encryption, Azure Active Directory.
  • Performance and hyperscale: In-Memory OLTP v2 support, Stretch Database.
  • Modern RDBMS: Temporal, JSON, Polybase.
  • Visit what’s new in SQL Server 2016 to learn more.

End-to-End support for Database Lifecycle Management (DLM) & DevOps practice

New Connection Experience for Microsoft SQL Server and Azure SQL Database

  • Easily connect to any database from your history – from Publish, Schema Compare, Data Compare.
  • Pin favorite connections for easy access.
  • Browse your Azure SQL Databases direct from Visual Studio and simply click to connect.
  • Azure firewall rule creation is automatically handled at connection time. See https://www.youtube.com/watch?v=VUHk-o8gjpI for details.

 

SQL Server Integration Services Enhancements

SSIS Designer creates and maintains packages for SQL Server 2016, 2014, or 2012

  • Create, maintain, and debug packages that target SQL Server 2016, SQL Server 2014, or SQL Server 2012. In Solution Explorer, right-click on an Integration Services project and select Properties to open the property pages for the project. On the General tab of Configuration Properties, select the TargetServerVersion property, and then choose SQL Server 2016, SQL Server 2014, or SQL Server 2012.

Reuse Control Flow across Packages by Using Control Flow Package Parts

  • Save a commonly used control flow task or container to a standalone part file (a “.dtsxp” file) and reuse it multiple times in one or more packages by using control flow package parts.

Support for OData V3 and V4 data sources

  • For OData V3 protocol, the component supports the ATOM and JSON data formats.
  • For OData V4 protocol, the component supports the JSON data format.

Column names for errors in the data flow

  • When you redirect rows in the data flow that contain errors to an error output, the output contains a numeric identifier for the column in which the error occurred, but does not display the name of the column. Now the name of the column in which the error occurred can be displayed in the Advanced Editor and in the Data Viewer.

Support for Hadoop and HDFS

  • The Hadoop Connection Manager now supports both Basic and Kerberos authentication.
  • The HDFS File Source and the HDFS File Destination support both Text and Avro formats. The HDFS File Source also supports the ORC format.
  • The Hadoop File System task now supports the “Within Hadoop” option in addition to the “To Hadoop” and the “From Hadoop” options.

AutoAdjustBufferSize property

  • When you set the value of the new AutoAdjustBufferSize property to true, the data flow engine automatically calculates the buffer size for the data flow.

 

SQL Server Reporting Services Enhancements

Connectivity improvements

  • Native support for Oracle through ODP.NET
  • Support for Teradata 14.x
  • Support for SAP BW Session Security
  • Support for Personalized Connection Strings

Tree Map and Sunburst Charts

  • Enhance your reports with Tree Map and Sunburst charts, great ways to display hierarchical data. For more information, see Tree Map and Sunburst Charts in Reporting Services.

Modern paginated reports

  • Design beautifully modern paginated reports with new, modern styles for charts, gauges, maps and other data visualizations.

 

Contact us:

If you have any questions or feedback, please visit our forum or Microsoft Connect page. We are fully committed to improving the SSDT experience and look forward to hearing from you!
