Tuesday, 31 July 2012

Entity Framework 4.3 Seeding Data Using int as Identity Column Type

When seeding data using EF 4.3 code first, there is a gotcha when seeding data in identity columns. Let me explain.

In a recent project, I had two tables as part of my model. The first one was called Questions, and the code looked like:

    [Table("Questions", Schema = "Forms")]
    public class Question

        public int QuestionID { get; set; }

        [Required(ErrorMessage = "The question is required")]
        public string QuestionText { get; set; }

        public string Marker { get; set; }


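As an aside, code first picks up QuestionID as the primary key purely by convention, and because it's an integer key it gets mapped to an IDENTITY column. A sketch of the equivalent explicit mapping (these attributes are my illustration, not part of the original model) would be:

    // Explicit equivalent of what code first applies by convention:
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int QuestionID { get; set; }

That convention-applied identity mapping is what causes the behaviour described below.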
The next table was a relationship table to link Question records to a table with Form records. I had used a linking table as I needed a display order column:

    [Table("FormsQuestions", Schema = "Forms")]
    public class FormsQuestions

            [Key, ForeignKey("Form"), Column(Order = 0)]
            public int FormID { get; set; }
            public virtual Form Form { get; set; }

            [Key, ForeignKey("Question"), Column(Order = 1)]
            public int QuestionID { get; set; }
            public virtual Question Question { get; set; }

            public int DisplaySequence { get; set; }

As you can see, I am using int as my type for the identity column in the Questions table, which is fairly common. When seeding the data for these two tables, I can set the value of QuestionID in the Questions table... or can I? I ran across this gotcha when I made a mistake while seeding the Question table:

    var questions = new List<Question>
    {
        new Question { QuestionID = 1, QuestionText = "Question text one", Marker = "<<one>>" },
        new Question { QuestionID = 3, QuestionText = "Question text two", Marker = "<<two>>" }
    };
    questions.ForEach(s => context.Questions.Add(s));

As you can see, the second record should have had a QuestionID of 2; the 3 was a typo. However, since I was setting the QuestionID column values (or so I thought), I didn't expect this to be a problem. There are many more questions seeded in the actual code; I've trimmed the data for the example in this post.

If I seed the joining table for the questions with QuestionIDs 1 and 3 from above, I have something like:

    var formsquestions = new List<FormsQuestions>
    {
        new FormsQuestions { QuestionID = 1, FormID = 1, DisplaySequence = 10 },
        new FormsQuestions { QuestionID = 3, FormID = 1, DisplaySequence = 20 }
    };
    formsquestions.ForEach(s => context.FormsQuestions.Add(s));

However, this is going to fail! When the data is inserted, the identity column assigns the values 1 and 2 to the question records, not 1 and 3, so the second FormsQuestions row references a QuestionID (3) that doesn't exist.
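One way to sidestep the problem entirely (a sketch of an alternative, not what my original code did) is to seed the join table through navigation properties rather than hard-coded IDs, letting EF fill in whatever key values the database actually generates; here 'form' is assumed to be a Form instance already added to the context:

    // Sketch: let EF wire up the generated keys via navigation properties.
    var q1 = new Question { QuestionText = "Question text one", Marker = "<<one>>" };
    var q2 = new Question { QuestionText = "Question text two", Marker = "<<two>>" };
    context.FormsQuestions.Add(new FormsQuestions { Question = q1, Form = form, DisplaySequence = 10 });
    context.FormsQuestions.Add(new FormsQuestions { Question = q2, Form = form, DisplaySequence = 20 });

This trades control over the key values for seed data that can never disagree with the identity column.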

It transpires that whatever value I set for the QuestionID column when seeding is ignored: the value I supply is not inserted with the record. Instead, the database assigns the next available identity value, and there doesn't seem to be a command in EF that's equivalent to SET IDENTITY_INSERT <table_name> ON.
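The closest I've found is to drop down to raw SQL from the Seed method. A sketch, assuming SQL Server (note that IDENTITY_INSERT is scoped to the session, so the ON/OFF statements and the INSERT must go in one batch):

    context.Database.ExecuteSqlCommand(
        @"SET IDENTITY_INSERT [Forms].[Questions] ON;
          INSERT INTO [Forms].[Questions] (QuestionID, QuestionText, Marker)
          VALUES (1, 'Question text one', '<<one>>'),
                 (3, 'Question text two', '<<two>>');
          SET IDENTITY_INSERT [Forms].[Questions] OFF;");

Alternatively, decorating the key with [DatabaseGenerated(DatabaseGeneratedOption.None)] stops EF mapping it as an identity column at all, at the cost of having to supply every key value yourself.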

Special mention must go to my friend Luke Baughan for pointing me in the right direction when I ran across this issue! Please read his blog at:
