Question

Partyz on Tue, 19 Mar 2013 11:00:18


Hi,

In ErikEJ's ExportSqlCe lib I can't find a proper index-related method, so I'm wondering what the best approach is, as suggested by its author, its users, or both ;-) Or am I missing something?

So far my idea, since all the index-related generate methods need the index name, is to loop through the original CE database's tables, collect the index names and put them into an enumerable, but... how? At this point I suspect the computational cost is too heavy for this purpose, but I lack experience with ErikEJ's lib.

OK. So here is the question:

What's the best way to extract indexes to a SQL script, so I can apply them after I've done some work on the tables?

(The reason I'm doing it in stages - first extracting the schema (the CREATE TABLEs) from SQL CE and recreating it on the SQL Server Express side, then bulk-copying the data from SQL CE to SQLEx, then extracting the primary keys from SQL CE and recreating them on SQLEx, then doing the same for the foreign keys, and finally for the indexes - is that the lib's generateTableScript method emits foreign keys right after the CREATE TABLE statements, so, obviously, SQL throws errors: you can't add a foreign key on a column of a table that hasn't been created yet :) - or, again, am I missing something?)
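For clarity, the bulk-copy step I mean would look roughly like this (a minimal sketch; the table name and both connection strings are placeholders):

using System.Data.SqlClient;
using System.Data.SqlServerCe;

// Stream one table's rows from SQL CE into the matching, already-created
// table on SQL Server Express. "MyTable" and the connection strings are placeholders.
using (var ceConnection = new SqlCeConnection(@"Data Source=C:\data\source.sdf"))
using (var sqlConnection = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=Target;Integrated Security=True"))
{
    ceConnection.Open();
    sqlConnection.Open();

    using (var ceCommand = new SqlCeCommand("SELECT * FROM [MyTable]", ceConnection))
    using (var reader = ceCommand.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(sqlConnection))
    {
        bulkCopy.DestinationTableName = "[MyTable]";
        bulkCopy.WriteToServer(reader); // streams all rows via the data reader
    }
}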

Looking forward to quick answers.

Best regards,

Partyz Kaft





Replies

ErikEJ on Tue, 19 Mar 2013 11:34:53


You can script everything in the correct order via my API, see my recent blog post here: http://erikej.blogspot.dk/2013/03/sql-server-compact-code-snippet-of-week_8.html
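The core pattern from that post looks roughly like this (a sketch; the path and connection string are placeholders):

using (IRepository ceRepository = new DBRepository(@"Data Source=C:\data\source.sdf"))
{
    // Scripts the entire database (schema and data) in dependency order;
    // a large database will produce several numbered script files.
    var generator = new Generator(ceRepository, @"C:\scripts\export.sql");
    generator.ScriptDatabaseToFile(Scope.SchemaData);
}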

Partyz on Tue, 19 Mar 2013 11:46:45


Am I right in supposing that Scope.SchemaData will write the whole schema to the file: the structure (CREATE TABLEs), primary keys, foreign keys and then indexes? :)

Best regards,

Partyz Kaft





ErikEJ on Tue, 19 Mar 2013 11:58:47


Not sure what you mean by "Bulk Copy" data - do you mean using the SqlBulkCopy API? That should never take 60-90 minutes??

The writing should take seconds, but many script files will be created with a large database, and you must run each in order.

I cannot guess on the execution time, you will have to test that.

Alternatively you can use the script-schema-only option (Scope.Schema) and split the generated script after the CREATE TABLE statements, then run those two scripts on either side of the bulk copy process. Try that, and have a look at the generated script.
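A rough sketch of that split, assuming the key and index statements in the generated script begin with ALTER TABLE or CREATE INDEX lines (verify against your actual script; the markers here are an assumption):

using System;
using System.IO;
using System.Linq;

// Split a Scope.Schema script into a "tables" part (run before the bulk copy)
// and a "keys/indexes" part (run after). Assumes the first constraint or index
// statement marks the boundary - check your generated script.
string[] lines = File.ReadAllLines(@"C:\scripts\schema.sql");
int splitAt = Array.FindIndex(lines, l =>
    l.TrimStart().StartsWith("ALTER TABLE", StringComparison.OrdinalIgnoreCase) ||
    l.TrimStart().StartsWith("CREATE UNIQUE INDEX", StringComparison.OrdinalIgnoreCase) ||
    l.TrimStart().StartsWith("CREATE INDEX", StringComparison.OrdinalIgnoreCase));
if (splitAt < 0) splitAt = lines.Length; // no constraints/indexes found
File.WriteAllLines(@"C:\scripts\schema_tables.sql", lines.Take(splitAt).ToArray());
File.WriteAllLines(@"C:\scripts\schema_constraints.sql", lines.Skip(splitAt).ToArray());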

I will be happy to test performance for you if you are willing to share a database (you can contact me by email: e j l s k o v (a t) h o t m a i l . c o m )

ErikEJ on Tue, 19 Mar 2013 12:30:50


SchemaData = everything; Schema = everything except INSERTs; Data = only INSERTs
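In code that maps to something like this (a sketch; Scope.Schema and Scope.SchemaData are the values discussed in this thread - check the Scope enum for the exact name of the data-only member; ceConnectionString and the path are placeholders):

using (IRepository ceRepository = new DBRepository(ceConnectionString))
{
    var generator = new Generator(ceRepository, @"C:\scripts\export.sql");
    generator.ScriptDatabaseToFile(Scope.SchemaData); // tables, keys, indexes + INSERTs
    // generator.ScriptDatabaseToFile(Scope.Schema); // everything except INSERTs
    // For INSERTs only, pass the data-only Scope value.
}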

Partyz on Tue, 19 Mar 2013 13:33:07


Yes, I mean SqlBulkCopy.

But back to your (excellent, by the way!) library.

FYI: 660 MB sdf file = 137 scripts, 2.42 GB in total, ~26 mins to generate.

And now the implementation.

I have this:

using (IRepository ceRepository = new DBRepository(AppNamespace.Properties.Settings.Default.DatabaseCeConnectionString))
{
    string fileName = C2XMT.Properties.Settings.Default.MigrationScriptsPath + "\\" + AppNamespace.Properties.Settings.Default.CETempFile;
    var generator = new Generator(ceRepository, fileName);
    generator.ScriptDatabaseToFile(Scope.SchemaData);

    using (IRepository serverRepository = new ServerDBRepository(AppNamespace.Properties.Settings.Default.DatabaseExpressConnectionString))
    {
        serverRepository.ExecuteSqlFile(fileName);
    }
}

Nothing happened after all the scripts were created - it looks like serverRepository.ExecuteSqlFile() did nothing.

So instead I ran this over the directory containing the scripts:

            foreach (string file in Directory.GetFiles(AppNamespace.Properties.Settings.Default.MigrationScriptsPath))
            {
                ImportScript(File.ReadAllText(file), AppNamespace.Properties.Settings.Default.DatabaseExpressConnectionString);
            }

and the method:

private void ImportScript(string sqlScript, string sqlConnectionString)
{
    try
    {
        int i = 0;
        SqlConnection sqlConnection = new SqlConnection(sqlConnectionString);
        // Split the script into batches on GO separators
        IEnumerable<string> commandStrings = Regex.Split(sqlScript, @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase);
        sqlConnection.Open();
        foreach (string commandString in commandStrings)
        {
            if (commandString.Trim() != "")
            {
                new SqlCommand(commandString, sqlConnection).ExecuteNonQuery();
            }
            i++;
        }
        sqlConnection.Close();
    }
    catch (Exception ex)
    {
        // suppressed for now - note this swallows any SQL errors silently
        return;
    }
}

It has been running since 14:19 (MTZ - my time zone ;)) - if it takes about the same time as the generation did, that's acceptable to the client.

SUMMARY

It triples the size of the data :) but it seems to be faster than SqlBulkCopy.

JUST A WHILE AGO

The ALTER TABLE statement conflicted with the FOREIGN KEY constraint 

:)

I'm sure the scripts are executed one by one in order from 0 to the end :)



ErikEJ on Tue, 19 Mar 2013 13:52:48


You must ensure the scripts are executed in order; you could just use this for each file:

using (IRepository serverRepository = new ServerDBRepository(AppNamespace.Properties.Settings.Default.DatabaseExpressConnectionString))
{
    serverRepository.ExecuteSqlFile(fileName);
}
Use that instead of ImportScript.
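If you want a single loop over the directory, something like this (a sketch; the OrderBy assumes the generated numeric file suffixes sort correctly as strings - adjust the ordering if your file names differ):

using System;
using System.IO;
using System.Linq;

// Run every generated script in order via the library's ExecuteSqlFile.
// The explicit OrderBy matters: Directory.GetFiles makes no ordering
// guarantee, and the scripts must run from first to last.
var files = Directory.GetFiles(AppNamespace.Properties.Settings.Default.MigrationScriptsPath)
                     .OrderBy(f => f, StringComparer.OrdinalIgnoreCase);

foreach (string file in files)
{
    using (IRepository serverRepository = new ServerDBRepository(AppNamespace.Properties.Settings.Default.DatabaseExpressConnectionString))
    {
        serverRepository.ExecuteSqlFile(file);
    }
}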

Yes, the scripts are very verbose...

In order to help you troubleshoot, you have to share a repro....

Partyz on Tue, 19 Mar 2013 14:42:26


I will need to prepare a repro containing all the crucial logic (the schema with keys and indexes) stripped of sensitive data, as my client disallows info leaks ;-)

For now, thank you very much for the help, and for the time you've spent on this.

I hadn't realized I could use Scope.Schema - so now I have a very clean schema into which I can insert the bulk-copied data without a problem :-)

We'll catch up as I work out some time for the side-by-side implementation - I don't know tomorrow's timetable yet.

Best regards,

Partyz Kaft