LightSpeed User Guide
Contents
Introduction
    What is LightSpeed?
    New to LightSpeed?
    Old Hand?
    Key Features
    About This Book
Validation
    Specifying Validation Criteria
    Advanced Validation Options
    Automatic Validations
    Declarative Validation in Code
    Overriding OnValidate
    Validation Considerations
    Localising Validation Messages
Transactions
    Using TransactionScope
    Using ADO.NET Transactions
    Automatic Transactions
Identity Generation
    Identity Methods in LightSpeed
    Setting the Identity Method Globally
    Overriding the Identity Method on a Per-Entity Basis
    KeyTable Identity Generation
    Sequence and MultiSequence Identity Generation
    Guid and GuidComb Identity Generation
    IdentityColumn Identity Generation
    Identity Generation Options
    How Block Allocation Methods Work
Changing the Designer Defaults
Customising the Generated Code
    Using Your Own Code Generation Templates
    Extending the Designer Metamodel
    Using T4 Templates with the LightSpeed Designer
Controlling How Entity Data Loads
Understanding Named Aggregates
    Visualising Aggregates
User Ids for Entity Tracking and Soft Deletion
    Built-In User Identification Strategies
    Using a Custom Identification Strategy
Foreign Keys That Are Part of a Composite Key
Composite Foreign Keys That Overlap the Primary Key
Many-to-Many Associations Represented As Composite Keys
Database Driver Versioning
LINQ Support Limitations
    Unsupported LINQ Operators
    Comparisons to an Associated Entity
    CLR Methods in a LINQ Query
    Joining, Grouping and Combining
    Database Specific Limitations
Introduction
What is LightSpeed?
"At its worst business logic can be very complex. Rules and logic describe many different cases and slants of behavior, and it's this complexity that objects were designed to work with. A Domain Model creates a web of interconnected objects, where each object represents some meaningful individual, whether as large as a corporation or as small as a single line on an order form." (Martin Fowler)

LightSpeed is a framework that helps you to rapidly build persistent domain models for .NET applications. LightSpeed helps you to define classes and their relationships, and to conveniently load and save entities using a database, avoiding the need to write large amounts of boilerplate data access code. Using LightSpeed, you work at the level of business objects, not at the level of database rows.
New to LightSpeed?
Your LightSpeed installation includes the book Getting Started with LightSpeed, which walks you through the process of creating a domain model and using it in a simple application. You can view this using the Mindscape > LightSpeed > Getting Started link on the Start menu. The installation also includes a number of samples which will help you understand how to use specific features of LightSpeed and how to structure applications. You can launch the samples using the Mindscape > LightSpeed > Samples link on the Start menu. We have also posted a number of screencasts on the Mindscape Web site, which demonstrate the basics of LightSpeed and drill into a number of key features.
Old Hand?
If you're already experienced with LightSpeed, you can find out what's new and different in this version of LightSpeed from the Mindscape > LightSpeed > What's New and Release Notes link on the Start menu.
Key Features
LightSpeed's design philosophy is centered on the following guiding principles:

- Convention over configuration.
- Support idiomatic .NET domain models: validation, data binding, change notification etc.
- Highly usable API and low barrier to entry.
- Encapsulate and encourage best practice patterns: session per request, Unit of Work etc.
- Small, lightweight and fast.
There isn't room here to provide a full list of LightSpeed features, but among the key features are:

- Domain modelling. LightSpeed supports domain-driven design concepts such as entities and value objects, the Unit of Work pattern and aggregates.
- Visual model design. You can create models in a visual designer, making it easy to see the relationships between entities and reducing the need for hand coding. Models can be created from an existing database or from scratch using a toolbox.
- Rapid design iteration. The designer allows quick, non-destructive synchronisation of the database and the model, meaning there are no speed-bumps as you evolve your model.
- LINQ. You can use the popular LINQ syntax and methods to perform LightSpeed queries, gaining the benefits of code completion, compiler type checking and so on. LightSpeed also provides a query object API for dynamic queries and finer control.
- Validation. You can specify validation rules at the entity or property level. LightSpeed automatically checks validity before allowing an entity to be saved. Each entity exposes an Errors collection which supports data binding for easy presentation.
- Eager and lazy loading. You can load an entity's dependencies in the same database query as the entity, avoiding the infamous N+1 problem. You can also define your own eager load graphs for different situations.
- UI framework support. LightSpeed implements several standard UI integration interfaces for you, including IEditableObject, INotifyPropertyChanged and IDataErrorInfo, making it easy to use entities in data-driven user interfaces.
- Convention-based mapping. You can just create your classes without having to spend extra effort mapping them to a database schema.
- Safe, efficient data access. LightSpeed always uses parameters in database statements, avoiding the risk of SQL injection vulnerabilities. Statements are optimised and batched for efficiency.
- Multiple databases. You can use LightSpeed with Microsoft SQL Server, Oracle, MySQL, PostgreSQL, IBM DB2, SQLite, SQL Server Compact and VistaDB. It also works with the Amazon SimpleDB and Microsoft SQL Azure cloud databases.
- Database migrations. For manageable and controllable upgrading and downgrading of the database schema.
- Distributed application support. You can ship an object graph over the wire, work with it on a client and return the changes efficiently to the server without needing to leave the LightSpeed API.
You can access these books from the Mindscape > LightSpeed folder on the Start menu.
Once we've defined our entity in this way, we can use LightSpeed to retrieve or persist instances of this entity to and from the database. We can also go on to configure behaviour (e.g. adding validation) and performance (e.g. caching or lazy-loading) as required, and to extend the domain model by adding custom methods. LightSpeed provides a visual designer for creating domain models and defining domain entities. The designer makes it easy to produce both the object and relational representations from a single master model, providing an extremely convenient modelling workflow. You can also define entities purely in code. For a step-by-step introduction to creating domain models and using them in LightSpeed, see the Getting Started guide, which you can find on the Start menu.
The designer is initially blank (except for some links for users who want help getting started). You can either create entities in the designer using the Toolbox, or create entities from your existing database tables.
Initially, the new entity will be called Entity1. To change this, just type the new name while the entity is selected.
The Visual Studio Properties window shows more options for configuring the entity:
For example, you can also edit the entity name through the Name box in the Properties window. Many of the settings in the Properties window are things you'll only need to think about as you start building up your model. For example, you'll use the Persistence options if you need to customise the way the entity is stored in the database. This book covers the various options under the relevant chapters. You can also get an idea of what each option does by looking at the description area at the bottom of the Properties window.
One option that's important for all models is the Identity Type option. Every entity in LightSpeed has an Id, a unique identifier that allows LightSpeed to tell it apart from other entities of the same type. The default identity type is Int32, the .NET Int32 type, equivalent to the C# int type. If you expect to create a huge number of entities, you'll probably want to change this to Int64 (C# long). Some users prefer GUID Ids to numeric Ids, so you can also choose the Guid identity type. (The String identity type is usually used only for natural keys in legacy databases: see Working with Legacy Databases if you think you need string keys.)
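In the generated code, the identity type surfaces as the type parameter of the Entity<TId> base class (and hence as the type of the Id property). As a sketch, with illustrative class names, entities with the common identity types look like this:

```csharp
// Default identity type: Int32 (C# int)
public class Customer : Entity<int> { }

// For very large numbers of entities: Int64 (C# long)
public class AuditRecord : Entity<long> { }

// GUID identity
public class Document : Entity<System.Guid> { }
```

The rest of this book's examples use Entity<int>, the default.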
To add properties to the entity, right-click it and choose Add New Entity Property.
A shortcut for this is to hit the Insert key when the entity (or an existing property of the entity) is selected. You can edit the name of the new property by typing:
The default data type for newly created properties is String. Use the Properties grid to change this if necessary:
You can also enter the type in front of the property name, just like declaring a C# field:
When you save your model, LightSpeed generates .NET classes for each of the entities you have defined. These classes are stored in a generated code file with the same name as the model file and a .cs or .vb file extension. You can open this file in Visual Studio to view the entity class code, but you must not edit the file: if you do, your changes will be overwritten the next time LightSpeed regenerates the code. Always make changes through the designer, never in the generated code file.
The generated entity classes are partial classes. This means you can extend them through your own partial class files, for example to add domain methods. Again, always do this through a partial class file, never by editing the generated code.
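For example, assuming the designer has generated an Order entity with Quantity and UnitPrice properties (illustrative names, not part of any particular sample), you could add a domain method in your own partial class file alongside the generated one:

```csharp
// Order.Custom.cs -- your own file, safe from regeneration.
// Assumes a designer-generated Order entity with Quantity (int)
// and UnitPrice (decimal) properties; the names are illustrative.
public partial class Order
{
    // A domain method added outside the generated code file.
    public decimal TotalPrice()
    {
        return Quantity * UnitPrice;
    }
}
```

Because both files declare `partial class Order`, the compiler merges them into a single class, so your additions survive regeneration.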
(You only need to do this once: LightSpeed remembers the settings for future updates.) Now, if you right-click on the model background, you will see an Update Database option:
Click on this and LightSpeed will compare your model to the database and display a list of changes that need to be made to the database. (If the database doesn't already exist, LightSpeed may be able to create it for you. This depends on the database provider. For providers where LightSpeed can't create databases, you'll need to use a suitable database administration tool to create a blank database.)
You can use Update Database to keep your database schema in sync with your model as your model evolves. Update Database does not regenerate tables each time, but instead applies only the changes it detects between the model and the database schema. Hence, it does not modify or delete existing data (except when it detects that, for example, a column needs to be deleted because it is no longer in the model) so it is safe to use with databases containing test data. It is not, however, a production tool! If you want to capture the changes that Update Database detects, so that you can run them against production environments in a controlled way, see the Database Migrations chapter.
You can exclude an action proposed by Update Database by clearing the relevant checkbox. However, because the difference between the model and database remains, LightSpeed will continue to suggest the change until you reconcile the difference. Update Database allows you to very rapidly iterate your model. Because it is non-destructive it is easy to tweak the model incrementally until you are happy with it, try out experimental changes, and so on. For more information about this, see Workflows for Rapid Application Development in the chapter Working with Models in the Visual Designer.
When you drag tables from a database, you don't need to specify the entity name and identity type, or the property names and data types, because these are already set up in the database through the tables and columns. LightSpeed works them out from the database schema. You can make changes to an entity which has been dragged from Server Explorer just as if you had created it using the Toolbox. For example, you can add new properties by choosing Add New Entity Property or using the Insert key. Of course, this means your entity class is now out of sync with your database, but you can use Update Database to fix that. Update Database works out what changes need to be made and applies them to the database for you.
(In this case, you don't need to have entered a database provider or connection string the way you did when you started from scratch: LightSpeed works them out from the Server Explorer connection.) If you want to keep the database as the master source for the model, you can use the Update From Source command instead of Update Database.
Update From Source updates the model to be in sync with the database schema. For example, if you have added a column to a table, Update From Source will add a corresponding property to the entity. Note however that Update From Source only looks at existing entities: this is because a database may contain a huge number of tables not related to the task at hand, and you don't want these cluttering up the model.
Like Update Database, Update From Source is non-destructive. For example, if you've applied validations or renamed a property (provided you have kept the database column mapping), Update From Source will not overwrite your changes. Of course, if you have changed something that takes the model out of sync with the database, such as changing a property data type, then Update From Source will propose to change it back. As with Update Database, you can exclude an action proposed by Update From Source by clearing the relevant checkbox, but the action will continue to appear until you reconcile the difference.
This creates a property at each end of the association. The "one" end has a collection property, representing the collection of associated child entities. The "many" end has a backreference property, representing the parent entity. In the example above, Customer has an Orders collection property, and Order has a Customer backreference property. LightSpeed guesses names for these
properties based on the entity names. You can edit these names by clicking on them in the diagram, or by selecting the arrow and editing the Collection Name and Backreference Name options in the Properties window. In code, you work with these properties in the same way as with other collection and object properties:
// Using a collection property
customer.Orders.Add(newOrder);
customer.Orders.Remove(cancelledOrder);

// Using an entity association
order.Customer = requestingCustomer;
A one-to-many association also results in a foreign key property. You can't customise the name or type of this property: the name is always the backreference name followed by Id (for example, CustomerId), and the type is always the identity type of the parent entity. You may sometimes use the foreign key property in your code, especially in serialisation scenarios, and you can customise its mapping to a database column if required (see Controlling the Database Mapping). A one-to-one association is added in much the same way as a one-to-many association, except that it has a source and a target instead of a collection and a backreference. When you use Update Database to create or update database tables, LightSpeed creates a foreign key column in the appropriate table to represent each association. If you are creating entities from database tables, then LightSpeed creates associations for you based on foreign keys in the database. Consequently, when you drag a table onto the designer, any columns that are foreign keys do not appear as properties; instead, they appear implicitly as the foreign keys of the inferred associations.
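For example, in a serialisation scenario you might read the foreign key rather than loading the parent entity. This sketch assumes an Order entity with the convention-named CustomerId property described above, and a hypothetical OrderDto class of your own:

```csharp
// Build a DTO from the foreign key property. Reading CustomerId
// does not require the associated Customer entity to be loaded.
// OrderDto is a hypothetical application class, not part of LightSpeed.
var dto = new OrderDto
{
    Id = order.Id,
    CustomerId = order.CustomerId
};
```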
Many-to-Many Associations
Many-to-many associations work in a slightly different way to one-to-many or one-to-one associations, because they have to be stored in a different way at the relational level. Instead of a simple foreign key, a many-to-many association requires a whole table of foreign keys. This table is variously known as a join table, relationship table or through table. Each entry in the through table represents a pair of associated entities; and an entity can participate in multiple pairs. A many-to-many association is modelled in LightSpeed using a through association, so named because it goes through an intermediate entity. The intermediate entity corresponds to the through table and is known as the through entity. To create a through association, select the Through Association connector in the toolbox, and drag an arrow between the entities you want to associate. You must then select the arrow and specify the through entity. The easiest way to do this is to enter a name in the Auto Through Entity box: LightSpeed will create a minimal through entity for you. (For more information about this and how
to get finer control over the through entity, see the chapter Working with Models in the Visual Designer.)
A through association results in two collection properties, one at each end. LightSpeed guesses names for these based on the names of the entities. When you use a through association from code, you'll usually use these two collections in just the same way as normal collections: you can iterate over them, add items to them, remove items from them, and so on. Using a through association
// iterating over the through association
foreach (Tag tag in contribution.Tags)
    Console.WriteLine(contribution.Title + " is tagged " + tag.Value);

// modifying the through association collection
contribution.Tags.Add(penguinTag);
contribution.Tags.Remove(dromedaryTag);

// modifying the other end
penguinTag.Contributions.Add(waddlingVideo);
The through association also results in a through entity class and one-to-many associations from the main entity classes to the through entity class (which in turn manifest in code as collection, backreference and foreign key properties). These are visible in code, but most applications don't need to use them. When you use Update Database to create or update database tables, LightSpeed creates any required through tables, with suitable foreign keys.
XML Documentation
Your model can include documentation for entities, properties, one-to-many associations and stored procedures. Documentation will be emitted as XML documentation comments which are displayed in Intellisense or can be built into a Help file using a tool such as Sandcastle. To view or edit documentation, right-click on the designer and choose Documentation. The LightSpeed Documentation window is displayed. Depending on the selected entity, this will display different fields, typically Summary, Remarks and Additional. Enter your documentation into these fields. The Documentation window tracks your selection in the same way as the Properties window.
For most fields, LightSpeed will generate the required documentation tags (e.g. <summary>) for you. However, for fields marked (XML), you must include the container tags yourself. This allows you to generate elements such as <exception> which are not directly represented in LightSpeed. LightSpeed 4.0 supports documentation for most common model elements, but not for some less frequently used elements. Please visit the support forum (linked from the Start menu) if you need to document a model element which does not currently support documentation.
Entity classes must have a public default constructor. If you don't specify a constructor, the C# compiler supplies a public default constructor, but if you specify a non-default constructor, you'll need to provide a default constructor as well. (See below for Visual Basic considerations.) The persistent state of an entity is defined by its fields. It's very important to understand that LightSpeed is interested in fields, not properties! The designer blurs this distinction, because in most entities, every field is wrapped by a property and every property wraps a persistent field. But when you hand-write code it's essential to understand it. For example, it means you mustn't use C# automatic properties, because the backing field for automatic properties is compiler-generated and will have the wrong name:
public class Customer : Entity<int>
{
    // Wrong: the compiler-generated backing field will not have
    // the name LightSpeed expects.
    public string Surname { get; set; }
}
Instead, you must explicitly create a field with the right name. (LightSpeed also permits the underscore prefix.)
You can then create a wrapper property so that application code can access the persistent value. The property getter can just return the field value, but the property setter must call the Entity.Set method. The Set method is important because it is how LightSpeed knows that the field has changed. This is essential for knowing that an entity needs to be saved, and to support application interfaces such as IEditableObject and INotifyPropertyChanged. A wrapper property for a persistent field
public class Customer : Entity<int>
{
    private string _surname;

    public string Surname
    {
        get { return _surname; }
        set { Set(ref _surname, value); }
    }
}
You don't have to provide a wrapper property, and LightSpeed won't care if you don't. LightSpeed only cares about the field, and about the Set method being used to modify it. Because LightSpeed cares only about fields, not properties, LightSpeed attributes (for example, validation attributes, or attributes that control the database mapping) must go on fields rather than properties. (The compiler will warn you if you make a mistake.)
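For example, a validation attribute belongs on the persistent field, not on the wrapper property. (A sketch: this assumes LightSpeed's presence validation attribute is named ValidatePresenceAttribute; check the API reference for the exact attribute names in your version.)

```csharp
public class Customer : Entity<int>
{
    [ValidatePresence]    // correct: the attribute goes on the field
    private string _surname;

    public string Surname
    {
        get { return _surname; }
        set { Set(ref _surname, value); }
    }
}
```

Placing the attribute on the Surname property instead would have no effect on persistence or validation, which is why the compiler warning mentioned above is useful.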
EntityHolder<T> must be matched with a foreign key field, whose name is the same as the EntityHolder<T> field followed by Id. For example, if the Order class has a holder named _customer, it must have a scalar field named _customerId. A one-to-many association therefore looks like this in code: Representing a one-to-many association
public class Customer : Entity<int>
{
    private readonly EntityCollection<Order> _orders = new EntityCollection<Order>();
}

public class Order : Entity<int>
{
    private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();
    private int _customerId;
}
When application code accesses an association property, you need to ensure that the association is loaded. To do this, call the Get method. This loads the association (the collection or the associated entity) if required. If the association is already loaded, Get doesn't do anything. You'll usually call Get from a property getter. Application code can update a collection using the Add and Remove methods. To update an entity reference, you must call the Set method. Set updates the contents of the EntityHolder<T> and updates other information such as the foreign key field. The conventional property wrappers for a one-to-many association therefore look like this in code:
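A sketch of those conventional wrappers, following the Get and Set rules just described (the exact code the designer generates may differ in detail):

```csharp
public class Customer : Entity<int>
{
    private readonly EntityCollection<Order> _orders = new EntityCollection<Order>();

    public EntityCollection<Order> Orders
    {
        get { return Get(_orders); }    // ensures the collection is loaded
    }
}

public class Order : Entity<int>
{
    private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();
    private int _customerId;

    public Customer Customer
    {
        get { return Get(_customer); }  // ensures the entity is loaded
        set { Set(_customer, value); }  // also updates _customerId
    }
}
```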
Remember, as with simple fields, LightSpeed cares only about fields, not properties, so you're not forced to follow this pattern or even to expose association properties at all. One-to-one associations are implemented in the same way as one-to-many associations, except that you use an EntityHolder<T> at both ends. The usage of the EntityHolder<T> is the same, and there must be a foreign key field at one end. Normally, LightSpeed can match associations up with their reverse associations because there is only one association between any two classes. If you have multiple associations between the same pair of classes, you must use ReverseAssociationAttribute to pair them up.
public class Customer : Entity<int>
{
    [ReverseAssociation("Customer")]
    private readonly EntityCollection<Order> _orders = new EntityCollection<Order>();
}

public class Order : Entity<int>
{
    [ReverseAssociation("Orders")]
    private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();
    private int _customerId;
}
Many-to-Many Associations
As mentioned above, a many-to-many association in LightSpeed is represented by a through association. A through association is implemented in terms of a one-to-many association to a through entity and a many-to-one association from the through entity to the target entity. For example, suppose you want to model a many-to-many association between Contribution and Tag entities. You would need a through entity, which we will call ContributionTag, and one-to-many associations from Contribution to ContributionTag and Tag to ContributionTag. Most through entities contain nothing except the associations to the entities being linked, which are represented as the EntityHolder<T> ends of one-to-many associations. As usual the foreign key fields are required as well. So the ContributionTag entity would look like this: Representing a through entity in code
public sealed class ContributionTag : Entity<int>
{
    private int _contributionId;
    private int _tagId;
    private readonly EntityHolder<Contribution> _contribution = new EntityHolder<Contribution>();
    private readonly EntityHolder<Tag> _tag = new EntityHolder<Tag>();

    // Wrapper properties omitted for brevity. Wrapper properties are optional
    // and could be left out if you do not expect to work directly with through
    // entities.
}
Conversely, Contribution and Tag each have a reverse association which is one-to-many: each Contribution can have any number of ContributionTags and each Tag can also have any number of ContributionTags. Heres the relevant fragment of Contribution: The underlying one-to-many association for a through association
public class Contribution : Entity<int>
{
    private readonly EntityCollection<ContributionTag> _contributionTags =
        new EntityCollection<ContributionTag>();

    // Wrapper property omitted
}
You would define a similar association from Tag to ContributionTag. Finally, you can now implement the through association. This is a field of type ThroughAssociation<TThrough, TTarget>. The through association needs to be initialised with the EntityCollection representing the one-to-many association from the source entity (Contribution) to the through entity (ContributionTag). To load the through association, call the Get method. As with other associations, this is usually done in the property getter.
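Continuing the example, the through association field and its wrapper might look like the following. This is a sketch: the initialisation shown (passing the one-to-many collection to the constructor) and the use of Get in the getter follow the description above, but the declared property type is an assumption, and the designer-generated code may differ in detail.

```csharp
public class Contribution : Entity<int>
{
    private readonly EntityCollection<ContributionTag> _contributionTags =
        new EntityCollection<ContributionTag>();
    private readonly ThroughAssociation<ContributionTag, Tag> _tags;

    public Contribution()
    {
        // Initialise the through association with the collection for the
        // underlying one-to-many association to the through entity.
        _tags = new ThroughAssociation<ContributionTag, Tag>(_contributionTags);
    }

    public ThroughAssociation<ContributionTag, Tag> Tags
    {
        get { return Get(_tags); }  // loads the association if required
    }
}
```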
Again, the Tag entity will contain similar code for its Contributions through association. The through entity is a true entity in the model, so you can use it to hang data and behaviour relating to the association itself. For example, to track who applied a particular tag to a particular contribution, you could put an AddedBy field on the ContributionTag entity.
You only need to do this if you are hand-coding entities. If you use the designer then it is taken care of for you.
Another small wrinkle is that Get and Set, the LightSpeed Entity methods, are reserved words in Visual Basic, and must therefore be escaped with square brackets:

Calling the Get and Set methods from Visual Basic
Public Property StatusName() As String
  Get
    Return [Get](_statusName) ' note square brackets around Get
  End Get
  Set(ByVal value As String)
    [Set](_statusName, value) ' note square brackets around Set
  End Set
End Property
Again, if you use the designer, it will take care of this for you.
Validation
LightSpeed provides a rich, extensible object-level validation framework. An entity may be validated by calling the Entity.Validate method or querying the Entity.IsValid property. Validation errors are exposed at the entity-level through the bindable Entity.Errors collection. Custom validation may be performed by overriding the Entity.OnValidate method. Objects are always validated before they are saved to the database and a ValidationException will be raised if an attempt is made to save an invalid object.
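As a sketch of the flow described above, checking an entity and reporting its errors might look like this (the Contribution entity is illustrative, and we assume the items in the Errors collection expose the property name and message as shown):

```csharp
var contribution = new Contribution { Title = "" };

if (!contribution.IsValid)   // runs the validation rules
{
  foreach (var error in contribution.Errors)
  {
    // Assumed member names on the validation error object
    Console.WriteLine("{0}: {1}", error.PropertyName, error.ErrorMessage);
  }
}
```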
The following validation options are available:

Validate Email: Ensures that the property contains a valid email address.

Validate Format: For strings, ensures that the string conforms to the supplied regular expression.

Validate Length: For strings, validates the string length. Write <= n if the string must be no longer than n characters, >= n if the string must be at least n characters, and n - m if the string must be between n and m characters.

Validate Presence: Ensures that a value has been provided for the property. For numeric types, this means the value is non-zero; for strings, it means the value is not null or the empty string.

Validate Unique: Ensures that the property value is unique (ideal for email addresses, user names, etc.). See Validation Considerations below.

Validate URI: Ensures that the property contains a valid URI.

Validate Value: For numeric values, validates the property value. You can specify a range using the format min - max, or a comparison using the =, !=, <, >, <= and >= operators, which have the same meanings as in C# (e.g. <= n to ensure that the value is less than or equal to n).
To open the LightSpeed Model Explorer, choose View > Other Windows > LightSpeed Model. Then expand the tree view to show the property whose validation you want to customise, and open its Validations folder. You can then modify validations through the Properties grid, and add them by right-clicking the property and choosing Add New validation_type Validation. In particular, to add a custom validation, choose Add New Custom Validation.
Note that some validation options in the Properties window may map to different underlying validation objects in the tree view. For example, a Validate Value expression may be implemented as a Range Validation or a Comparison Validation.
Automatic Validations
LightSpeed infers several validations automatically based on the underlying model. Association presence validation ensures that any required associations are present. LightSpeed infers this based on whether the association foreign key field is nullable. DateTime range validation ensures that any DateTime fields fall within a range acceptable for the database at hand.
These validations are built-in and therefore result in built-in messages. To override the message, you must apply a declarative validation in code (see below). Use the ValidateAttribute with a rule type of PresenceAssociationValidationRule or ProviderDateRangeValidationRule as appropriate, and specify your custom message on the ValidateAttribute.
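As a sketch, overriding the built-in association presence message might look like the following. This assumes ValidateAttribute takes the rule type as a constructor argument and exposes an ErrorMessage-style property for the custom message; the field and message text are illustrative:

```csharp
[ValidateAttribute(typeof(PresenceAssociationValidationRule),
  ErrorMessage = "Please choose a customer for this order.")]
private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();
```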
The following attributes are available:

ValidateAttribute: Used to specify a custom validation rule.

ValidateComparisonAttribute: Compares the target field to another value, e.g. less than 100.

ValidateEmailAddressAttribute: Ensures that the property contains a valid email address.

ValidateFormatAttribute: For strings, ensures that the field conforms to the supplied regular expression.

ValidateLengthAttribute: For strings, ensures that the string length falls between the specified bounds.

ValidatePresenceAttribute: Ensures that a value has been provided for the property. For numeric types, this means the value is non-zero; for strings, it means the value is not null or the empty string.

ValidateRangeAttribute: For numeric values, validates that the value is within the specified range.

ValidateUniqueAttribute: Ensures that the property value is unique (ideal for email addresses, user names, etc.). See Validation Considerations below.

ValidateUriAttribute: Ensures that the property contains a valid URI.
Note that the designer merges some of these attributes (for example, the Validate Value option can map to either ValidateComparisonAttribute or ValidateRangeAttribute depending on the particular validation).
Overriding OnValidate
To perform more complicated validation, such as whole entity validation logic, override the Entity.OnValidate method. Report errors by adding them to the Errors collection.
Overriding OnValidate
protected override void OnValidate()
{
  if ((Contributor == null) && (ApprovedBy == null))
  {
    Errors.AddError("Must have one or the other");
  }
}
Validation Considerations
When using the Validate Unique validation (or the ValidateUniqueAttribute), you should be aware that:

- It incurs a COUNT(*) query against the database each time the validation runs. The designer will create a unique constraint on the column, which should make this fast, but you should still be aware that it can cause a lot of queries during, for example, a bulk import.

- The check is performed against the database, not against other in-memory entities. For example, if you create two new entities with the same value and save them in the same unit of work, you can bypass the uniqueness validation. Again, a unique constraint at the database level will catch this, but it will result in a database exception rather than a validation exception.
Basic Operations
The core operations of LightSpeed are the familiar CRUD (Create, Read, Update, Delete) database operations. The CRUD model lies behind the majority of database-backed applications and Web sites. This chapter shows you how to implement CRUD using LightSpeed.
During this process, the unit of work object automatically:

- Tracks loaded entities in an identity map.
- Tracks entity state changes, so that it knows which entities (if any) need to be saved.
- Updates change tracking and versioning information if required.
- Connects to the database as and when required.
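A typical unit-of-work session that exercises all of these steps might be sketched as follows (the entity and property names are illustrative, and we assume a LightSpeedContext named context as used elsewhere in this guide):

```csharp
using (var unitOfWork = context.CreateUnitOfWork())
{
  var order = unitOfWork.FindById<Order>(orderId);  // tracked in the identity map
  order.Status = OrderStatus.Shipped;               // change is tracked automatically
  unitOfWork.SaveChanges();                         // connects and saves as required
}
```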
These settings typically remain the same for the lifetime of the application, though they may of course be different for different instances of the application. You'll therefore normally create your LightSpeedContext object as a static or singleton object; there's no benefit, and some disadvantages, to creating a new LightSpeedContext every time you need one.
The following values are recognised for dataProvider:

SqlServer2000: Microsoft SQL Server 2000 using System.Data provider.
SqlServer2005: Microsoft SQL Server 2005 using System.Data provider.
SqlServer2008: Microsoft SQL Server 2008 using System.Data provider.
MySql5: MySQL 5 database using MySql.Data provider.
PostgreSql8: PostgreSQL 8 database through Npgsql provider.
SQLite3: SQLite 3 database through System.Data.SQLite provider.
Oracle9: Oracle 9 (or higher) database through System.Data.OracleClient provider.
Oracle9Odp: Oracle 9 (or higher) database through Oracle.DataAccess provider.
VistaDB3: VistaDB 3 database through VistaDB.NET20 provider. This provider has been deprecated and is no longer officially supported.
VistaDB4: VistaDB 4 database through VistaDB.4 provider.
SqlServerCE: SQL Server Compact 3.5 through System.Data provider.
SqlServerCE4: SQL Server Compact 4 through System.Data provider (requires additional supporting assembly; see the Working with Database Providers chapter).
DB2: DB2 9.5 through IBM.DB2 provider.
AmazonSimpleDB: Amazon SimpleDB cloud database or compatible.
See the chapter Controlling the Database Mapping for more information on these options.
(In future we won't normally show the LightSpeedContext. We've shown it here because, when you use LINQ, it's important to remember to use the strong-typed generic version of LightSpeedContext.)
When you write a LINQ query against a LightSpeed query property, the query is translated to SQL and executed on the database. For example, the LINQ where clause is translated to a SQL WHERE clause. This means processing is efficient: for example, LINQ does not bring back all Order entities and filter them on the client.
// Method 1: Declare your own strong-typed unit of work class
public class StoreUnitOfWork : UnitOfWork
{
  public IQueryable<Order> Orders
  {
    get { return this.Query<Order>(); }
  }
}

// Method 2: Call Query explicitly on a weak-typed unit of work
IUnitOfWork unitOfWork; // weak typed
IQueryable<Order> orders = unitOfWork.Query<Order>();
To perform paging of a query, use the Skip and Take extension methods. If you don't also specify an order, either explicitly in the LINQ query or implicitly on the entity class, Skip and Take order entities by Id. You can combine Skip and Take if you want to page through a result set.
var ordersToDisplay = unitOfWork.Orders
    .OrderBy(o => o.OrderDate)
    .Skip(pageStart)
    .Take(pageCount);
To work with the entities returned from a LINQ query, use the foreach keyword to iterate over the query, or use the ToList extension method to load the results into a list.
If you are only interested in a single entity, apply the First or Single extension method to obtain it. First returns the first matching entity, ignoring any others; Single checks that there is only one matching entity.
List<Order> allOrders = unitOfWork.Orders.ToList();

Order order = unitOfWork.Orders.Single(o => o.Id == orderId);
If you want to know how many entities fit the query criteria, apply the Count extension method. If you want to know if any entities fit the query criteria, apply the Any extension method.
int pendingOrderCount = unitOfWork.Orders
    .Where(o => o.Status == OrderStatus.Pending)
    .Count();
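An Any query, mentioned above, works the same way; a sketch using the same illustrative Order entity:

```csharp
// Translated to an existence check on the database rather than
// materialising entities on the client
bool hasPendingOrders = unitOfWork.Orders
    .Any(o => o.Status == OrderStatus.Pending);
```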
To perform a projection (that is, to select only a subset of the entity fields), use the select keyword or the Select extension method. If you perform a projection, you will typically project into a non-entity type; the data will not be associated with the unit of work or cached in the identity map, and changes to the object will not be saved when the unit of work is flushed. Projections are therefore typically used for presenting partial, read-only information about an entity.
var orderSummaries = from o in unitOfWork.Orders
                     select new { OrderId = o.Id, o.OrderReference };
All of these methods are translated to SQL so that LightSpeed does not waste time and bandwidth pulling back unwanted rows or columns. For example, if you specify Take(5) then LightSpeed will limit the number of rows returned to 5; if you specify Count() then LightSpeed issues a SQL COUNT query rather than materialising entities on the client. See also Advanced Querying Techniques later in this book.
LINQ Expressions
LINQ allows you to write queries of arbitrary complexity. LightSpeed handles only queries that can be translated to SQL on the database at hand. Consequently, if you write complex queries, you may encounter NotSupportedException at runtime. This indicates that LightSpeed was not able to translate the LINQ query to SQL. Consider simplifying the query, and performing further operations on the client. You can use the ToList() and AsEnumerable() operators to partition work between the database and the client. For known limitations on what LINQ expressions LightSpeed can translate to SQL, see the Appendices.
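For example, you might run the filter on the database, then use AsEnumerable to perform an untranslatable computation on the client. In this sketch, IsValidChecksum is an illustrative client-side method that LightSpeed could not translate to SQL:

```csharp
var checkedOrders = unitOfWork.Orders
    .Where(o => o.Status == OrderStatus.Pending)   // translated to SQL
    .AsEnumerable()                                // later operators run on the client
    .Where(o => IsValidChecksum(o.OrderReference))
    .ToList();
```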
Query Expressions
The most commonly used query object is the QueryExpression object. You can pass a QueryExpression directly to IUnitOfWork.Find<T> to load entities by criteria, like the LINQ Where operator. To create a QueryExpression, use the Entity.Attribute static method to represent the attribute you want to query on, then apply comparison operators such as ==, <, and so on.

Using a QueryExpression to query by criteria
IUnitOfWork unitOfWork; // weak typed
IList<Order> orders = unitOfWork.Find<Order>(Entity.Attribute("Status") == "Pending");
You can combine query expressions using Boolean operators such as && and ||:
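A sketch of a combined criterion, following the Find pattern shown in the surrounding examples (the attribute names and values are illustrative):

```csharp
IList<Order> orders = unitOfWork.Find<Order>(
    Entity.Attribute("Status") == "Pending" && Entity.Attribute("Total") > 100);
```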
Query expressions also support the In, Like and Between methods for criteria that can't be represented using the built-in operators:
IList<Order> orders = unitOfWork.Find<Order>(Entity.Attribute("CustomerId").In(1, 3, 5));
Query expressions support traversal into associated entities using the dot syntax:
IList<Order> orders = unitOfWork.Find<Order>(Entity.Attribute("Customer.Name") == "Bob");
Query Objects
In some cases you need to specify additional querying options over and above the criteria, sort order and paging. In these cases, you must create a Query object and pass this to Find. The Query object allows you to specify projections, perform full text searches and customise entity load graphs. Specific functions are covered in the relevant sections of this user guide, or see the Query object in the API reference.
Always use FindById for identity lookups, because it tries the lookup in the unit of work's identity map first, and queries the database only if the lookup fails. This greatly improves efficiency if the entity is already part of the unit of work.
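For example:

```csharp
// Checks the identity map first; hits the database only on a miss
Order order = unitOfWork.FindById<Order>(orderId);
```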
Count Queries
If you only need to know how many entities meet your query criteria, without bringing back those entities, you can use the Count method instead of Find.
long orderCount = unitOfWork.Count<Order>(new Query( Entity.Attribute("Customer.Name") == "Bob"));
The Count method doesn't have an overload that allows you to pass a QueryExpression; you must pass a full Query object. You can pass a QueryExpression to the Query constructor.
If a new entity is associated with another entity that is already part of a unit of work, it automatically becomes part of the same unit of work. This saves you having to remember to add the entity to the unit of work separately. Any kind of association will trigger this.

Creating a new entity which implicitly becomes part of the unit of work
Customer customer = unitOfWork.FindById<Customer>(customerId);
Order order = new Order { OrderReference = orderRef };
customer.Orders.Add(order); // implicitly adds order to the same unit of work as customer
Note that because LightSpeed saves units of work, you must add the entity to a unit of work (whether explicitly or implicitly) in order for it to be saved. Just creating the entity is not enough! As part of adding the entity to a unit of work, LightSpeed assigns an Id to the entity. Before a new entity becomes part of a unit of work, its Id is invalid and should not be used.
LightSpeed automatically determines that the entity has changed, and marks it to be saved.
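So a typical update requires no explicit "mark dirty" step; a sketch with an illustrative entity and property:

```csharp
var customer = unitOfWork.FindById<Customer>(customerId);
customer.Name = "Updated Name";   // LightSpeed detects the change
unitOfWork.SaveChanges();         // the modified entity is saved
```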
If an entity has dependent associations, LightSpeed cascade deletes the dependent entities. This avoids database integrity errors due to foreign key constraints. For example, if every Order is associated with a Customer, and you delete a Customer, then all Orders associated with that Customer are also deleted. However, if the association is not dependent (that is, if your model allows dangling Orders not associated with a Customer), then the Orders are merely detached from the Customer and remain in the database. You can override the default cascade delete behaviour by setting LightSpeedContext.CascadeDeletes in code or the cascadeDeletes attribute in configuration, or by setting the Cascade Deletes option on an entity (which affects all associations where that entity is the parent), or by setting the Is Dependent option on an individual association. In all cases, remember that entities are not deleted from the database immediately, but will be deleted when you save the unit of work.
SaveChanges validates all entities that are due to be added or updated, and will not save an invalid entity. (Invalid entities may be deleted.) By default, the saved entities remain part of the unit of work, in case you want to carry out more changes on them. You can remove them, forcing LightSpeed to reload fresh copies, by calling the SaveChanges(bool reset) overload and passing true. However, it is usually clearer to start a new unit of work for the new batch of activity. After you have finished with a unit of work, you must call Dispose.
Transactions
SaveChanges is automatically transactional: that is, if more than one entity needs to be saved, LightSpeed guarantees that the changes to the database will be atomic and durable (provided the database supports transactions). To achieve this, SaveChanges automatically begins a transaction before sending the first change, and commits it after sending the last change if all changes have been successful. As far as LightSpeed is concerned, however, each SaveChanges is an independent transaction. If you need multiple LightSpeed operations to be part of a single transaction, you must specify that transaction yourself.
Using TransactionScope
The easiest way to control transactions with LightSpeed is to use the .NET TransactionScope class. To do this, simply surround any calls to IUnitOfWork with the standard TransactionScope block: Using a system transaction with a LightSpeed unit of work
using (var transactionScope = new TransactionScope())
{
  using (var unitOfWork = _context.CreateUnitOfWork())
  {
    var contribution = unitOfWork.FindById<Contribution>(1);
    contribution.Description = "A description";
    unitOfWork.SaveChanges();
  }
  transactionScope.Complete();
}
Automatic Transactions
Remember that you only need to manually specify a transaction if you need to coordinate at a larger scope than a single SaveChanges. LightSpeed automatically ensures that all database flush operations run within a transaction. In most cases therefore you will not need to create transactions explicitly.
If you have dragged a table into the model from a database, and you're dissatisfied with the inferred names, you can quickly rename an entity or field while retaining its mapping by using the Refactor > Rename command and checking the Keep existing name as database table/column name option. If you have dragged a table into the model from a database, and the primary key column has a name other than Id, LightSpeed will automatically create the identity column mapping for you.
All members of INamingStrategy get passed a defaultName by LightSpeed. If you want to accept the default convention in a particular case, just return defaultName. Once you have implemented a naming strategy to represent your convention, you must tell LightSpeed to use it. To do this, set LightSpeedContext.NamingStrategy in code, or the namingStrategyClass attribute in configuration. When setting the naming strategy in configuration, you must provide a full assembly-qualified type name.

Specifying the Hungarian convention in configuration
<add name="Test" namingStrategyClass="MyApp.HungarianNamingStrategy, MyApp" />
Reserved Words
Occasionally you will want to use a name in your domain model which is a reserved word in SQL. The classic example is Order, which clashes with SQL's ORDER BY keyword. To prevent errors in this case, set the quoteIdentifiers configuration attribute, or set LightSpeedContext.QuoteIdentifiers in code.
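In configuration, this follows the same <add> pattern as the other examples in this guide (the name and connection string name here are illustrative):

```xml
<add name="Test"
     connectionStringName="Main"
     dataProvider="SqlServer2008"
     quoteIdentifiers="true" />
```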
Examples
Overriding persistence behaviour
[Transient]
private EditStatus _editStatus;

private readonly decimal _discountedPrice;
Identity Generation
Every LightSpeed entity has an Id, which LightSpeed uses to uniquely identify the entity. When a newly created entity becomes part of a unit of work for the first time, LightSpeed assigns it an Id. LightSpeed has a number of ways of generating Ids for entities. The default identity method is KeyTable, which is an efficient and portable way of generating numeric Ids. This section describes how to override the default, and the alternative options.
The identity block size must match the sequence increment amount. To change the block size, set LightSpeedContext.IdentityBlockSize in code, or the identityBlockSize attribute in configuration. If the identity block size is different from the sequence increment amount, this will cause errors.

If you use the Sequence method, you must create the sequence in the database. To do this, run the Sequence.sql schema file for your database, which you can find in your LightSpeed install directory. The provided Sequence.sql assumes the default identity block size of 10; if you change the identity block size, you must change the sequence increment amount, and vice versa.

If you use the MultiSequence method, you must create each of the sequences you intend to use. MultiSequence is typically used with existing databases with a sequence-per-table policy, in which case the sequences will presumably already exist. If you use MultiSequence, you must also implement a naming strategy to tell LightSpeed which sequence to use for each table. Your naming strategy class must implement the INamingStrategy interface and return the per-table name from GetMultiSequenceName. All other members can return the default name.

Specifying sequence names for the MultiSequence identity method
public class SequencePerTableNamingStrategy : INamingStrategy
{
  public string GetMultiSequenceName(string defaultName, Type entityType)
  {
    // Example naming convention: seq_employee_ids, seq_order_ids, etc.
    return "seq_" + entityType.Name.ToLower() + "_ids";
  }

  // other members can all return defaultName (unless using a custom mapping policy)
}
See Defining Your Own Mapping Convention above for how to make LightSpeed use your naming strategy.
The GuidComb method generates GUID values that sort sequentially, which reduces index page fragmentation when a new value is inserted. Consider GuidComb if you want GUID Ids, have a very large data set and are performing a lot of inserts.
Each Web request therefore received its own unit of work, even though the units of work shared the same LightSpeedContext object.
You can also query a view using query objects, bypassing the designer and the generated helper property. To do this, set Query.ViewName.
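A sketch of this, assuming Query.ViewName can be set via an object initialiser and passed to Find as described in the Query Objects section (the view name is illustrative):

```csharp
var query = new Query { ViewName = "ActiveOrders" };
IList<Order> activeOrders = unitOfWork.Find<Order>(query);
```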
In either case, the view must have the same schema as the table because you are loading it into the same type of entity. Entities loaded through a view can be modified or deleted in the normal way, with the changes being applied via the underlying table.
If the stored procedure has output or input-output parameters, these will appear as out or ref parameters to the method.
The query object API uses ProcedureParameter objects to represent stored procedure parameters. You can declare parameters with direction Output, InputOutput or ReturnValue to receive values returned by the stored procedure through parameters or as the return value.
When using stored procedures on Oracle, you must follow a special convention for returning results. See the chapter Working with Database Providers for more information.
Configuration
You can configure LightSpeed through code or through the configuration file (web.config or appname.exe.config depending on the type of application). We discussed the core configuration options in the Basic Operations chapter. In this section we will review other configuration options. For a full list of configuration settings, see the Appendices.
Individual configuration options are specified as attributes of the add tag. A few settings are not available through the configuration file and can only be configured in code; these are typically advanced settings, such as custom strategy classes, which do not need to change between environments.
You can still modify the context object in code by setting its properties, for example if you need to apply a setting which is not available through configuration.
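For example, a minimal sketch: load most settings from the configuration file, then adjust a property in code (QuoteIdentifiers is described under Reserved Words; _context is assumed to be your configuration-created context):

```csharp
// _context was created from the configuration file; override one property in code
_context.QuoteIdentifiers = true;
```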
This will not load any settings from the configuration file. You can then set up the context object by setting its properties. You will usually use code-only setup only for prototypes and tests, because it leads to maintenance headaches when the code needs to be moved between environments.
Terminology
If you're familiar with LINQ to SQL or the Entity Framework, you may associate the term context with a domain model or database session, as in the LINQ to SQL DataContext. A LightSpeedContext is not like a DataContext: it holds configuration settings, not database data. The LightSpeed equivalent of a DataContext or ObjectContext is a unit of work.
Configuration Settings
There are a lot of configuration settings on LightSpeedContext, which are described in more detail in the relevant sections and listed in the Appendices. This section lists some of the most commonly used.

connectionStringName: The database connection string. The actual string is looked up from the name in the <connectionStrings> section. See the Basic Operations chapter for more information.

dataProvider: The type of database engine. See the Basic Operations chapter or the DataProvider enumeration for a list.

pluralizeTableNames: Indicates that LightSpeed should map entity classes to table names using the plural form of the class name. For example, a Person class would normally map to a Person table, but with pluralizeTableNames in effect it would map to a People table.

quoteIdentifiers: Quotes table and column names in SQL. This avoids conflicts with reserved words, but causes some databases to behave in a case-sensitive way, which can lead to mapping problems.

schemaName: The database schema in which the entity tables are found.

identityMethod: Specifies how LightSpeed assigns Ids to new entities. See Identity Generation in the Controlling the Database Mapping chapter.

loggerClass: The type of logger to which LightSpeed should log SQL statements and diagnostic and performance information. See Logging in the Testing and Debugging chapter for more information.
All of these (and the other) configuration settings have equivalent properties on the LightSpeedContext class.
To use this strategy, set LightSpeedContext.DisplayNamingStrategy in code, or the displayNamingStrategyClass attribute in configuration. Specifying a display naming strategy in configuration
<add name="Test" displayNamingStrategyClass="MyApp.ResourceLookupStrategy, MyAssembly" />
Implementing ConnectionStrategy
public class CloseableConnectionStrategy : ConnectionStrategy
{
  private IDbConnection _connection;

  public CloseableConnectionStrategy(LightSpeedContext context)
    : base(context)
  {
  }

  protected override IDbConnection Connection
  {
    get
    {
      if (_connection == null)
      {
        _connection = Context.DataProviderObjectFactory.CreateConnection();
        _connection.ConnectionString = Context.ConnectionString;
        _connection.Open();
      }
      return _connection;
    }
  }

  public void CloseConnection()
  {
    if (_connection != null)
    {
      _connection.Dispose();
      _connection = null;
    }
  }

  protected override void Dispose(bool disposing)
  {
    if (disposing)
    {
      if (_connection != null)
      {
        _connection.Dispose();
        _connection = null;
      }
    }
    base.Dispose(disposing);
  }
}
You must do this before the unit of work first connects to the database; otherwise the unit of work will use the default connection strategy. Do not change connection strategies in the middle of a unit of work. If desired, you can set up the connection strategy in the constructor of a strong-typed unit of work, or through LightSpeedContext.UnitOfWorkFactory. Note that the unit of work will not invoke any actions on your custom strategy other than to get connections and to notify various operations. It is up to application code to do this. For example, if you want to use the CloseableConnectionStrategy to close the connection after loading a screen, you might write something like this:
private void OnLoad()
{
  LoadDataToScreen();

  // We know we won't be needing the database again for a while, so drop the connection
  ((CloseableConnectionStrategy)_unitOfWork.ConnectionStrategy).CloseConnection();
}
The connection is released, but the unit of work remains live, and will automatically reinitiate a connection when required: for example, if you load new entities, traverse an association, or call SaveChanges. Note that some LightSpeed tasks, such as traversing an association, internally require a connection. Your connection strategy will still be used in such cases, but your application code may not be aware of it. For example, a data binding that traverses an association could cause your strategy to reinitiate a connection: this connection would remain open, and your application code would not know about it in order to reclaim it. If this is a concern, you can override ConnectionStrategy.OnDatabaseOperationComplete to receive notifications of LightSpeed database activity, but take care that you do not close a connection while LightSpeed is still using it.
We would then recommend that you set up a base class for your pages (or, if you are using a controller/presenter approach, for your controllers/presenters) which can leverage the PerRequestUnitOfWorkScope<TUnitOfWork> to provide you with access to your UnitOfWork.

Example use of PerRequestUnitOfWorkScope<TUnitOfWork> in your base class
public class PageBase : System.Web.UI.Page
{
  private PerRequestUnitOfWorkScope<ModelUnitOfWork> _unitOfWorkScopeHolder;

  public PageBase()
  {
    _unitOfWorkScopeHolder = new PerRequestUnitOfWorkScope<ModelUnitOfWork>(Global.LightSpeedContext);
  }
}
The PerRequestUnitOfWorkScope class holds instances of your typed UnitOfWork in the HttpContext.Items collection, so they will be available for the duration of the request. The UnitOfWork will only be instantiated once you call the .Current property on the PerRequestUnitOfWorkScope instance, so it is safe to instantiate the holder class early and set up a property accessor that passes through to the .Current property on the holder.
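The pass-through accessor described here might be sketched as follows (the field and type names follow the PageBase example; the UnitOfWork property name is our own):

```csharp
public class PageBase : System.Web.UI.Page
{
  private readonly PerRequestUnitOfWorkScope<ModelUnitOfWork> _unitOfWorkScopeHolder =
      new PerRequestUnitOfWorkScope<ModelUnitOfWork>(Global.LightSpeedContext);

  // The unit of work is created lazily, on first access to Current
  protected ModelUnitOfWork UnitOfWork
  {
    get { return _unitOfWorkScopeHolder.Current; }
  }
}
```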
Finally, you will need to dispose the UnitOfWork instance at the end of the request to ensure the database connection is released. We recommend that this is done as part of the EndRequest event, and that you hook this in your Global.asax.cs.

Example of PerRequestUnitOfWorkScope<TUnitOfWork> disposal code
protected void Application_Start(object sender, EventArgs e)
{
  EndRequest += new EventHandler(OnEndRequest);
}

void OnEndRequest(object sender, EventArgs e)
{
  var scope = new PerRequestUnitOfWorkScope<ModelUnitOfWork>(LightSpeedContext);
  if (scope.HasCurrent)
  {
    scope.Current.Dispose();
  }
}
Validation
LightSpeed provides a rich, extensible object-level validation framework. For more details about the validation framework in LightSpeed, please review the Validation section in the Creating Domain Models chapter. When you approach validation with ASP.NET, you will be interested in handling both client-side and server-side validation concerns. You will implement your client-side validation manually; on the server, you can use the .IsValid property in your postback events to determine whether your updated LightSpeed entity is valid.
You may also wish to surface the validation error messages for your entity and bind those to a summary on the page. The pattern we would generally recommend for this is to expose an Errors property on your page which returns a LightSpeed ValidationErrorsCollection instance. This can then be populated from any of your entities which are invalid.
An example describing two-way data-binding between two fields on a form and a Member entity which has been exposed on the page instance
<lightspeed:EntityDataBinder runat="Server" ID="DataBinder">
  <lightspeed:EntityDataBindingItem runat="Server" BindingMode="TwoWay"
    BindingSource="Member" BindingSourceMember="Username"
    TargetControl="SignUpUsername" TargetControlProperty="Text" />
  <lightspeed:EntityDataBindingItem runat="Server" BindingMode="TwoWay"
    BindingSource="Member" BindingSourceMember="FirstName"
    TargetControl="SignUpFirstName" TargetControlProperty="Text" />
</lightspeed:EntityDataBinder>
Like other data-bound controls in ASP.NET, you need to explicitly call .DataBind() on the control to ask it to perform a data-binding operation. Because the source data is specified as part of the declarative mark-up for the control, there is no DataSource property which needs to be assigned. To unbind data back to your entities, call .Unbind() in your postback events, after which it is sensible to validate your objects and then deal with any resulting errors.
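A postback handler following this unbind-then-validate pattern might be sketched as follows (the control, entity and UnitOfWork property names are illustrative, following the earlier examples):

```csharp
protected void SaveButton_Click(object sender, EventArgs e)
{
  DataBinder.Unbind();       // push control values back into the Member entity

  if (!Member.IsValid)
  {
    // surface Member.Errors to the page, e.g. via a validation summary
    return;
  }

  UnitOfWork.SaveChanges();
}
```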
Samples
If you want to review some sample code, have a look at the Aptitude Test sample which is part of the Visual Studio 2008 samples solution.
We recommend that you use the LightSpeedControllerBase<TUnitOfWork> class, provided in the Mindscape.LightSpeed.Web assembly, as the base for your controllers. It provides a per-request scoped UnitOfWork using PerRequestUnitOfWorkScope<TUnitOfWork>, and by default handles the disposal of any UnitOfWork instances on your behalf. Example controller declaration using LightSpeedControllerBase<TUnitOfWork>
public class MyController : LightSpeedControllerBase<ModelUnitOfWork>
{
    protected override LightSpeedContext<ModelUnitOfWork> LightSpeedContext
    {
        get { return MvcApplication.LightSpeedContext; }
    }
}
By default, LightSpeedControllerBase disposes of any created UnitOfWork instance in the OnResultExecuted handler, or alternatively when the controller instance is disposed. If you wish to control the disposal behaviour manually (for example, to dispose the UnitOfWork at the end of the request so that it remains accessible through an external PerRequestUnitOfWorkScope<TUnitOfWork>), set the DisposeUnitOfWorkOnResultExecuted property to false to disable the automatic behaviour.
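A minimal sketch of disabling the automatic disposal. Setting the property in the constructor is an assumption; any point before the result executes should work:

```csharp
public class MyController : LightSpeedControllerBase<ModelUnitOfWork>
{
    public MyController()
    {
        // Keep the UnitOfWork alive past OnResultExecuted; it will instead
        // be disposed at end of request by an external
        // PerRequestUnitOfWorkScope (e.g. in Application_EndRequest).
        DisposeUnitOfWorkOnResultExecuted = false;
    }

    protected override LightSpeedContext<ModelUnitOfWork> LightSpeedContext
    {
        get { return MvcApplication.LightSpeedContext; }
    }
}
```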
Once it has been set up, your entities can be unbound either when binding takes place for an action argument, or manually when you call UpdateModel. The main limitation to be aware of with the LightSpeedEntityModelBinder is that it will not traverse any loaded associations. An alternative to the custom model binder provided with LightSpeed is to create your own model binder, most likely sub-classing DefaultModelBinder to retain the benefits it already provides. If you take this approach, you will primarily be interested in correctly handling any errors after model binding takes place. This can be achieved as shown below. An example of implementing BindModel when sub-classing DefaultModelBinder, extracting error messages from your LightSpeed entity
public override object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
    object result = base.BindModel(controllerContext, bindingContext);
    if (typeof(Entity).IsAssignableFrom(bindingContext.ModelType))
    {
        Entity entity = (Entity)result;
        if (!entity.IsValid)
        {
            foreach (var state in bindingContext.ModelState.Where(s => s.Value.Errors.Count > 0))
            {
                state.Value.Errors.Clear();
            }
            foreach (var error in entity.Errors)
            {
                if (error.ErrorMessage.EndsWith("is invalid")) continue;
                bindingContext.ModelState.AddModelError(error.PropertyName ?? "Custom", error.ErrorMessage);
            }
        }
    }
    return result;
}
Validation
When you approach validation with ASP.NET MVC you will need to handle both client-side and server-side validation concerns. The standard approach for validation in ASP.NET MVC is to use Data Annotation attributes. While LightSpeed has its own native validation framework, you can use the LightSpeedMvcValidatorProvider, which is part of the Mindscape.LightSpeed.Web assembly, to automatically describe the appropriate Data Annotation attributes needed for ASP.NET MVC to apply automatic validation. To make use of this, first add a new instance of the LightSpeedMvcValidatorProvider to your ModelValidatorProviders.Providers collection; typically this is done in Global.asax.cs.
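A sketch of the registration described above (a parameterless constructor on LightSpeedMvcValidatorProvider is an assumption; check the actual API):

```csharp
protected void Application_Start()
{
    // Register the LightSpeed validator provider so ASP.NET MVC can
    // surface LightSpeed validations as Data Annotations.
    ModelValidatorProviders.Providers.Add(new LightSpeedMvcValidatorProvider());
}
```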
The provider does not surface all LightSpeed validations because the Data Annotations framework does not contain equivalents for all of them. The following LightSpeed validations are surfaced:
ValidatePresenceAttribute
ValidateLengthAttribute
ValidateRangeAttribute (int and double ranges only)
ValidateFormatAttribute
For server side validation you will likely combine the use of a model binder with the integrated LightSpeed validation to check if an entity is in a valid state following a binding operation. Again for this we would check .IsValid. If you are using a model binder as described in the previous section then any errors will be assigned into ModelState so you can display these when you refresh your view. An example of checking IsValid following a binding operation using UpdateModel
try
{
    UpdateModel(myEntity);
}
catch (InvalidOperationException)
{
    updateModelErrored = true;
}

if (updateModelErrored || !myEntity.IsValid)
{
    // handle validation/binding failure
}
Samples
If you want to review some sample code, have a look at the TV Guide MVC 2 sample which is part of the Visual Studio 2008 samples solution or the Deals MVC 3 sample which is part of the Visual Studio 2010 samples solution.
Find any references on the page to asp:LinqDataSource and change them to ls:LightSpeedLinqDataSource. Do not change any attributes or other content. Declare a LightSpeedContext<MyModelUnitOfWork>. This is typically declared as a static read-only field in Global.asax.cs. Example declaration of a static LightSpeedContext in Global.asax.cs
public static readonly LightSpeedContext<MyModelUnitOfWork> LightSpeedContext =
    new LightSpeedContext<MyModelUnitOfWork>("Development");
In Global.asax.cs, locate the section entitled IMPORTANT: DATA MODEL REGISTRATION and add the following line in place of the commented-out call to model.RegisterContext: Dynamic Data Model Registration
model.RegisterContext(
    new LightSpeedDataModelProvider<MyModelUnitOfWork>(LightSpeedContext),
    new ContextConfiguration
    {
        ScaffoldAllTables = true,
        MetadataProviderFactory = t =>
            new EntityDataAnnotationProvider(new AssociatedMetadataTypeTypeDescriptionProvider(t))
    });
Once you have completed these steps your Dynamic Data site will be configured to use the LightSpeed data source. You can then customise the appearance of your site and your entities using the usual Dynamic Data customisation techniques.
Handling Associations
When presenting associations, ASP.NET Dynamic Data calls ToString() on the associated object. The ToString() implementation in the Entity class returns a generic string which is not very friendly to end users. You should override ToString() in any class you want to display on a Dynamic Data site.
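A minimal sketch of such an override, using a partial class alongside the designer-generated entity (the Member class and its Username property are illustrative):

```csharp
public partial class Member
{
    public override string ToString()
    {
        // Display a user-friendly value rather than the generic
        // Entity.ToString() output.
        return Username;
    }
}
```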
Validation
By default, ASP.NET Dynamic Data does not recognise LightSpeed validation attributes. Although validations still run, they run only on the server and ASP.NET Dynamic Data does not display validation error messages in a usable form. To enable integrated ASP.NET Dynamic Data validation, including client-side validation and error message display, set ContextConfiguration.MetadataProviderFactory to a callback which creates an instance of EntityDataAnnotationProvider when registering the data context. You will need to add a reference to Mindscape.LightSpeed.Web to bring this class into scope. Dynamic Data Model Registration
model.RegisterContext(
    new LightSpeedDataModelProvider<TestUnitOfWork>(LightSpeedContext),
    new ContextConfiguration
    {
        ScaffoldAllTables = true,
        MetadataProviderFactory = t =>
            new EntityDataAnnotationProvider(new AssociatedMetadataTypeTypeDescriptionProvider(t))
    });
(Please notice the use of AssociatedMetadataTypeTypeDescriptionProvider as an inner provider. This is required to enable buddy classes to work in cases where these are needed.) EntityDataAnnotationProvider does not surface all LightSpeed validations because ASP.NET Dynamic Data does not contain equivalents for all of them. Validations which are not surfaced still run, but without the integrated user experience. The following LightSpeed validations are surfaced:
ValidatePresenceAttribute
ValidateLengthAttribute
ValidateRangeAttribute (int and double ranges only)
ValidateFormatAttribute
In addition, EntityDataAnnotationProvider depends on the use of designer naming conventions to locate validations (because LightSpeed validations are specified on fields but must be surfaced on properties). That is, the backing field for each property must have the same name as the property, prefixed with an underscore. If you have hand-coded entities which do not follow this convention, you will need to add Dynamic Data validation attributes to your properties by hand.
LightSpeed supports all three of these approaches. Lastly, LightSpeed also provides support for exposing LightSpeed entities through RIA Services and Dynamic Data. You can find more information about building applications with RIA Services in the chapter on Building Silverlight Applications and you can find more information about building applications with ASP.NET Dynamic Data in the chapter on Building Web Applications.
A distributed unit of work is represented in LightSpeed as the IDistributedUnitOfWork interface. IDistributedUnitOfWork mirrors the IUnitOfWork interface and provides an identical set of operations which allow you to load, add and remove entities, and to save pending changes. IDistributedUnitOfWork is limited only in operations which deal with database-specific types such as IDbReader and IDbCommand: where functionality is not supported, a NotSupportedException is thrown, and the affected methods are noted in the XML documentation for the interface.
To create a DistributedUnitOfWork, a LightSpeedContext object must be instantiated with its UnitOfWorkFactory set to a new instance of the DistributedUnitOfWorkFactory class. The factory determines how to connect to the service endpoint using WCF, and by default will initialize itself from application configuration, using the endpoint named LightSpeedDistributedUnitOfWorkEndpoint. Alternatively, the factory can be instantiated with a WCF EndpointAddress and Binding if you require runtime configuration.
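A hedged sketch of both options described above. The ModelUnitOfWork name, the settable UnitOfWorkFactory property and the (EndpointAddress, Binding) constructor overload are assumptions based on this description; check the DistributedUnitOfWorkFactory API for the exact signatures:

```csharp
// Configuration-based: reads the endpoint named
// "LightSpeedDistributedUnitOfWorkEndpoint" from application configuration.
var context = new LightSpeedContext<ModelUnitOfWork>
{
    UnitOfWorkFactory = new DistributedUnitOfWorkFactory()
};

// Runtime-configured: supply an explicit WCF address and binding.
var runtimeContext = new LightSpeedContext<ModelUnitOfWork>
{
    UnitOfWorkFactory = new DistributedUnitOfWorkFactory(
        new EndpointAddress("http://localhost:3000/UnitOfWork"),
        new BasicHttpBinding())
};
```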
The DistributedUnitOfWorkService
When the DistributedUnitOfWork runs on the client, there needs to be a service endpoint capable of servicing requests targeting the model being used by the client. To host this, a DistributedUnitOfWorkService class is provided which allows you to host an endpoint using WCF. There are two ways this could be hosted: either manually host an instance of the service within a dedicated process (for example a console application or a Windows Service), or have the service activated by IIS using Windows Activation Services and a .svc file hosted in an ASP.NET website. Both approaches are supported.
You will also need to specify configuration for the endpoint as part of your system.serviceModel configuration section. Here is an example using a basicHttpBinding. The contract which is used is the Mindscape.LightSpeed.ServiceModel.IDistributedUnitOfWorkContract.
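The configuration example appears to have been lost at the page break here. The following sketch, modelled on the multi-instance example later in this section, shows the likely shape; the service name attribute is an assumption and should match the type name under which your service is hosted:

```xml
<system.serviceModel>
  <services>
    <!-- Service name is hypothetical: use the name of the hosted service type. -->
    <service name="Mindscape.LightSpeed.ServiceModel.DistributedUnitOfWorkService">
      <endpoint address="http://localhost:3000/UnitOfWork"
                binding="basicHttpBinding"
                contract="Mindscape.LightSpeed.ServiceModel.IDistributedUnitOfWorkContract"/>
    </service>
  </services>
</system.serviceModel>
```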
If you wish to have a new instance of the DistributedUnitOfWorkService created on a per-WCF-instance basis, you will need to subclass DistributedUnitOfWorkService, supplying the LightSpeedContext from your subclass's constructor. Manually hosting a DistributedUnitOfWorkService in multi-instance mode
public class MyUnitOfWorkService : DistributedUnitOfWorkService
{
    public MyUnitOfWorkService()
        : base(new LightSpeedContext("Development"))
    {
    }
}

using (var host = new ServiceHost(typeof(MyUnitOfWorkService)))
{
    host.Open();
    Console.WriteLine("Host has started - press ENTER to shut down");
    Console.ReadLine();
}
As with hosting in single-instance mode, you will need to specify configuration for the endpoint. The key difference from the example above is that the service type name must be specified in the name attribute of the service configuration; the address, binding and contract are all the same. Example configuration for a DistributedUnitOfWorkService hosted in multi-instance mode
<system.serviceModel>
  <services>
    <service name="Mindscape.LightSpeed.Samples.MyUnitOfWorkService">
      <endpoint address="http://localhost:3000/UnitOfWork"
                binding="basicHttpBinding"
                contract="Mindscape.LightSpeed.ServiceModel.IDistributedUnitOfWorkContract"/>
    </service>
  </services>
</system.serviceModel>
Supported Bindings
Because the DistributedUnitOfWork is built on WCF, it should support any of the standard transports and bindings available within WCF. From a support perspective, however, the implementation has been developed and tested only on the Named Pipe, TCP and HTTP bindings.
Data Contracts
If the project containing your LightSpeed model references System.ServiceModel, then the Visual Studio designer will automatically emit DataContract and DataMember attributes to mark up your model classes, allowing them to be correctly serialized by the WCF DataContract formatter. By default, all value properties are marked with a DataMember attribute, including foreign key identity fields, but all associations are omitted from serialization. This allows you to specify explicitly which structures should be serialized in distributed scenarios, rather than opting the whole domain model in to serialization. It is highly recommended that you review which associations should be serialized when designing your model for distributed scenarios.
Samples
There are two samples available which make use of the DistributedUnitOfWork, giving a practical view of how it can be used across the two standard types of application you are likely to use it with. The first is the ATM sample, part of the Visual Studio 2008 samples solution: the Teller project is an example of a WPF application which uses the DistributedUnitOfWork as part of a long-running stateful client application. The second is the Film Festival sample, part of the Visual Studio 2010 samples solution: the Website uses the DistributedUnitOfWork in a per-request fashion as part of a stateless ASP.NET MVC web application. For a sample which demonstrates building a hand-crafted WCF service that exposes entities, review the ATM sample: the ATMClient console application project makes several service calls which deal with entities shared with the Website service.
Given this model we might wish to represent a data transfer object which provides the details of a member to form part of our published service contract.
We need to achieve several steps to deal with data transfer objects within our system:
1. Project LightSpeed Member entities into ApplicationMember instances so we can send these objects out of our system and over the wire.
2. Allow inbound ApplicationMember instances to be mapped back to Member entities.
3. Support both creating new instances and updating existing instances.
The first step can be achieved by either creating a mapping function or by using LINQ to project the data as required. Assuming you are able to use LINQ then this would be the preferred approach. Example of projecting Member entities to ApplicationMember instances
var members = unitOfWork.Members
    .Select(m => new ApplicationMember()
    {
        Id = m.Id,
        Email = m.Email,
        UserName = m.UserName
    })
    .ToList();
The second and third steps can be achieved by using IUnitOfWork.Import which provides functionality for mapping arbitrary data transfer objects back against the entity store. It will conditionally create new entities or update existing entities based on the value of the Id property held on the data transfer object. Calling IUnitOfWork.Import
members.ForEach(m => unitOfWork.Import<Member>(m));
In our example the call to the Import method returns the actual Member entity instance that was attached to the UnitOfWork. As with the example above we do not make use of this, but if you need to perform any subsequent processing against the entity then it is available for use.
The default mapping approach used by the Import method comes with two caveats:
1. We expect that the data transfer object will have an Id property which we can use to check if there is already an entity with that Id in the database, and load it if that is the case so we can perform an update. If no Id property is present then that aspect of the mapper will be ignored and you will always be dealing with new entities.
2. The mapper will use reflection to assign properties of the data transfer object which exactly match the value fields on the entity. In the example you will notice that ApplicationMember has a case-sensitive match with the Member entity for the property UserName. If the property does not exactly match, then assignment will not happen.
Samples
If you wish to review a practical sample which uses Data Transfer Objects with LightSpeed then you should review the ATM sample which is part of the Visual Studio 2008 samples solution. The ATMClient console application project makes use of several service calls which rely on DTOs which have been defined in the Contracts project.
Unit Testing
### JD to do this bit ###
Logging
To aid in development, LightSpeed can be configured to log diagnostic messages. The logger is any object that implements the ILogger interface: this gives you the flexibility to log according to the requirements of the application and environment. For example, you can log to the Visual Studio Output window during development, then turn off logging in production. Or you could create a custom logger which would allow you to selectively turn on logging for production diagnostics.
Built-In Loggers
The framework provides two built-in loggers:
ConsoleLogger writes logs to the console window.
TraceLogger writes logs to all application trace listeners. Trace listeners are defined by the .NET Framework and can be specified in the configuration file via the <system.diagnostics> element. The .NET trace infrastructure also supports filtering of messages.
Enabling Logging
To enable logging, set LightSpeedContext.Logger in code, or the loggerClass attribute in configuration. When setting the logger class in configuration, you must provide a full assembly-qualified type name. Enabling logging in configuration
<add name="Test" loggerClass="Mindscape.LightSpeed.Logging.TraceLogger, Mindscape.LightSpeed" />
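The same can be done in code by assigning one of the built-in loggers to the Logger property (the _context variable name is an assumption):

```csharp
// Enable logging in code rather than configuration.
_context.Logger = new ConsoleLogger();  // or: new TraceLogger()
```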
You can display additional logging information by setting LightSpeedContext.VerboseLogging. This is not available in configuration because it should be set only during debugging.
Disabling Logging
To disable logging after it has been turned on, set LightSpeedContext.Logger to null.
_context.Logger = null;
Note that the VerboseLogging setting does not affect what information is available in the CommandLog object. CommandLog always contains full logging information. VerboseLogging affects only how much information is included in CommandLog.ToString().
Profiling
You can use the logging infrastructure to perform simple profiling.
Query Patterns
You can review the SQL logs to determine what queries LightSpeed is issuing and to look for inefficient patterns such as N+1 lazy loads. When looking for performance bottlenecks, it can be wise to identify a piece of code you want to profile, and turn logging off after executing that code; otherwise you may end up with a very large SQL trace, making it hard to determine whether you have traced a single inefficient run of code or a large number of efficient runs!
Timing Queries
When LightSpeed logs a SQL statement or batch, it also prints out the time taken to execute that statement. You can also access the time taken directly using the CommandLog.TimeTaken property.
To install the visualizer for Visual Studio 2010: Copy Mindscape.LightSpeed.DebuggerVisualizer.dll to My Documents\Visual Studio 2010\Visualizers or (Program Files)\Microsoft Visual Studio 10.0\Common7\Packages\Debugger\Visualizers.
To use the visualizer: Start a debug session. Examine a LightSpeed Query object in a DataTip, a debugger variables window or the QuickWatch dialog. You will see a magnifying glass icon next to the text. Click on the magnifying glass icon to display the query SQL.
The debugger visualizer supports only query objects, not LINQ queries.
Inheritance
Inheritance, when used wisely, can greatly assist in developing expressive, maintainable domain models that neatly capture potentially complex domain behaviours. LightSpeed supports inheritance through the popular single table inheritance and class table inheritance patterns, which encapsulate different ways of mapping an inheritance hierarchy to a relational model. Regardless of the database mapping, the inheritance relationship is modelled using the familiar .NET inheritance feature. To derive one entity class from another:
Designer: Choose the Inheritance arrow from the toolbox and drag an arrow from the derived to the base entity; or select the derived entity, go to the Properties grid, and set its Base Class to the desired entity type.
Code: Use the normal C# or Visual Basic inheritance syntax.
Discriminators
The key aspect of implementing single or class table inheritance is to define a discriminator column. A discriminator column is simply a column used by LightSpeed to determine the type of entity to instantiate when loading a row from the underlying table. Quite often this column is also a foreign key to an associated reference data table: for example, Employee has an EmployeeTypeId. Every inherited class in single or class table inheritance must specify a discriminator column, and all classes in an inheritance hierarchy must specify the same discriminator column. Furthermore, each class in the hierarchy (except the root class) must specify a discriminator value. This tells LightSpeed: "If the discriminator column contains this value, materialise the row as this type of entity." Each class must therefore specify a different discriminator value. If LightSpeed encounters a row which doesn't match any of the derived class discriminator values, it treats it as an instance of the root class. To specify the discriminator for a derived class:
Designer: Select the inheritance arrow from the derived to the base class, and fill out the Discriminator Name, Discriminator Type and Discriminator Value settings. If you set the Discriminator setting on the root class, LightSpeed will default the name and type for you.
Code: Apply DiscriminatorAttribute to the derived class, specifying the discriminator attribute name and the value that identifies this class.
There are various trade-offs associated with the STI pattern, the most obvious being the classic time/space trade-off. The STI pattern trades space for time: by storing the data in one table, querying and persisting the data becomes inherently simple and efficient. However, for hierarchies where the entities have different attributes, the underlying table may become sparsely populated. That said, most modern database systems are reasonably good at optimizing unused table space. ### TODO: picture of hierarchy and database mapping ### You don't need to do anything special to specify single table inheritance: if you follow the steps above, single table inheritance is what you will get.
Class table inheritance is set up in much the same way as single table inheritance: you must provide a discriminator column on the table corresponding to the root class, and specify a discriminator attribute and value on each derived class. In addition, you must specify class table inheritance:
Designer: For each inheritance arrow in the hierarchy, set Inheritance Type to ClassTableInheritance.
Code: Apply InheritanceMappingAttribute to the root class of the hierarchy, with an InheritanceMappingKind of ClassTableInheritance.
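A hedged sketch of a class-table-inheritance hierarchy in code. The attribute parameter forms shown are assumptions based on the descriptions above; check the DiscriminatorAttribute and InheritanceMappingAttribute API documentation for the exact signatures:

```csharp
// Root class: carries the inheritance mapping kind.
[InheritanceMapping(InheritanceMappingKind.ClassTableInheritance)] // parameter form assumed
public class Employee : Entity<int>
{
}

// Derived class: names the discriminator column and the value
// which identifies rows of this type.
[Discriminator("EmployeeTypeId", 2)] // parameter form assumed
public class Manager : Employee
{
}
```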
Value Objects
Many attributes of an entity can be represented by primitive values such as integers or strings. However, it's sometimes useful to represent an attribute by a type with more specific meaning. Such an attribute might be a single column, for example representing a Salary column by a Money type instead of Decimal, or it might be multiple columns, for example representing LocationX and LocationY columns by a Point or Position type instead of two separate Doubles. In these examples, Money, Point and Position are value objects. They are not entities, because they do not have identity of their own: they are just a way of representing entity attributes in a more business-meaningful way. Value objects are idiomatic in domain-driven design and are discussed in detail in Eric Evans' book of the same name.
Notice that a value object type does not have a base class or identity type, because a value object represents an attribute of an entity, and does not have identity in itself.
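A minimal hand-coded sketch of a value object under these rules: an immutable type with no identity of its own. This is a plain-C# illustration of the idiom; whether hand-coded value objects must derive from a LightSpeed-provided base type is not covered here, so check the documentation for hand-coded models:

```csharp
// Illustrative immutable value object: no identity, no setters.
public class Money
{
    private readonly decimal _amount;

    public Money(decimal amount)
    {
        _amount = amount;
    }

    public decimal Amount
    {
        get { return _amount; }
    }
}
```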
Depending on whether a value object type suitable for mapping these columns is already defined, the designer will offer to map the columns to an existing value object type, to add the columns to an existing value object type, or to create a new value object type (and a member of that type).
The reason for this is that value objects represent values. Suppose you have an Employee entity with a Salary of $50000. If the Employee gets a raise, then their Salary changes to a new value of $60000. It would be wrong to think that $50000 has mutated into $60000. $60000 is a different value from $50000, so it must be a different instance. (Remember, value objects don't have identity.) Even if you created a mutable value object, you could not set properties this way, because value objects don't have access to the Entity.Set method, which is essential for notifying LightSpeed of changes needing saving, not to mention for UI interfaces such as IEditableObject and INotifyPropertyChanged. So when you set a value object property, you must always set it to a new instance of the value type. Setting value object properties
employee.Salary = new Money(50000); hq.Location = new Position(-41.289, 174.777);
A consequence of this is that when using data binding or data grids you must provide a way to edit your custom value types. In the Location example, the entity has a single Location, which will appear as a single column in a data grid. That column will need to display and allow editing of the Position value, but the editor will need to keep producing new Position objects. You cannot, for example, provide two text boxes, one bound to Location.X and one bound to Location.Y.
For example, suppose that Position.X were mapped to Longitude and Position.Y to Latitude. Then the Site Location property would be mapped to the LocationLongitude and LocationLatitude columns in the database. You can map the column name prefix associated with an occurrence of a value object by setting Column Name Prefix on the connector in the designer, or by applying ColumnAttribute to the value object reference in code. For example, suppose that Site.Location were mapped to the Coordinates prefix. Then LightSpeed would map this to CoordinatesX and CoordinatesY columns in the database. The following table shows how modifiers on a field in a value object (such as Position.X) and modifiers on an occurrence of a value object (such as Site.Location) affect the column name mapping.

Site.Location                       Position.X: No modifier    Position.X: Column Latitude
No modifier                         LocationX                  LocationLatitude
Prefix Coordinates                  CoordinatesX               CoordinatesLatitude
In legacy databases, the columns of a value object may not share a common prefix, or there may not be a consistent set of suffixes across different sets of columns that you'd like to map to the same value object type. In this case, you can map individual fields of the value object at the occurrence level using ValueObjectColumnAttribute. At the time of writing, this is not supported in the designer and can only be used on hand-coded members. Mapping value object columns to an irregular database schema
public class Site : Entity<int>
{
    [ValueObjectColumn("X", "XPos")]
    [ValueObjectColumn("Y", "YPos")]
    private Position _location;

    [ValueObjectColumn("X", "Easting")]
    [ValueObjectColumn("Y", "Northing")]
    private Position _surveyCoordinates;
}
To open the LightSpeed Model Explorer, open the Visual Studio View menu and choose Other Windows > LightSpeed Model.
workflow, and of course it is pretty much mandatory if you are working with a legacy database that may be maintained by another team in the organisation.
exist in the model you may be offered the choice of creating a new value object type, mapping the columns to an existing value object type or moving the selected column(s) into an existing value object property of the entity. See Value Objects in the Domain Modelling Techniques chapter for more information.
Hide simple through entities. LightSpeed represents many-to-many associations as through associations, which are mediated by a through entity. In most cases the through entity has no attributes other than its associations to the entities with the many-to-many association. In this case you can suppress the through entity in the designer by right-clicking the through association and choosing Convert to Auto Through Entity. If you find you need to add properties or attributes to the through entity, you can reverse this by choosing Convert to Explicit Through Entity. See Many-to-Many Associations below for more information about modelling many-to-many associations.
Custom mappings are primarily intended for existing databases where the data format is already established. When creating new databases, prefer to use standard LightSpeed conventions, and where it's appropriate to use a domain type such as Money, consider mapping it using a value object rather than a custom mapping.
Extract Interface
Entity
Custom Views
Models, especially large models, can be hard to navigate. LightSpeed provides several options to help you visualise your model more easily.
Filtering
You can filter your view to show only selected entities. This can be useful for locating the part of the model you're interested in or for visualising model behaviour.
To filter the model, open the LightSpeed Model Explorer, select how you want to filter the view and enter the string to be matched. For the convenience of keyboard users, you can also use a prefix character on the filter string instead of clicking the Filter By option.

Filter By    Prefix    Effect
Name         (none)    Shows only entities whose names contain the filter string. The string is treated as a regular expression, so you can perform simple pattern matching.
Tag                    Shows only entities with a matching tag. Entities with no tags at all are always shown. Tags are a convenient way of labelling a related set of entities. You can set tags using the Tags property. An entity can have more than one tag.
Aggregate              Shows only entities which are part of the named aggregate specified by the filter string. An entity is part of the named aggregate if it has an association on which that aggregate is specified.
Invert                 Inverts the filter: that is, entities which match the filter are hidden instead of shown. (This can be specified only through a prefix, not through the Filter By drop-down.)
You can optionally show additional entities which, although they don't match the filter themselves, are related to entities that do. This helps you to see the filtered entities in context.

Show Option          Effect
Show selected        Shows only entities which match the filter.
Show associated      Also shows entities which have an association (one to many, one to one or through association) with the entities which match the filter. Only one level of association is followed; indirectly associated entities are not shown.
Show eager loaded    Also shows entities which are eager loaded by an entity which matches the filter. The full eager load graph is shown, not just immediately associated entities. Only always eager load associations are considered; optionally loaded associations via named aggregates are not shown (but can be shown using the @ filter).
Show inheritance     Also shows the base and derived classes of the entities which match the filter. Sibling classes are not shown: if you need to understand the full hierarchy, enter a filter which matches the root of the hierarchy.
QuickViews
To save the current filter, right-click the model and choose View > Save Current as QuickView. To reload a saved filter, right-click the model, choose the View submenu and select the name of the saved view. Note that LightSpeed saves and re-applies the filter definition: if the model has changed, the result of the filter may be different from before. You can rename, edit and delete QuickViews via the Quick Views folder in the LightSpeed Model Explorer. The View submenu also contains commands to quickly filter by tag, if any tags are defined in the model, and to remove all filtering and show the entire model.
Linked Models
Another solution to the problem of large, complex models is to replace a single model file with multiple linked model files. Linked model files work well when your domain consists of a number of distinct subdomains, with relatively few links across subdomain boundaries. Don't use linked model files if you have a lot of associations across subdomain boundaries: cross-file associations require quite a bit of maintenance, and if you have too many of them you may find they are more trouble than they are worth! To link a set of model files, you must do two things: Set the Name of each model to the same value. (If you are specifying a namespace in the model, then this must be the same for all of the linked models too.) Choose one of the models to be the main model. For all of the other models, set Is Linked Child to True.
If you use the Paste as Link command to create entity links, or the Refactor > Split Model At Association command to split an existing model into two files, it will set these properties for you.
Code Generation
Each linked model file in a set is code-generated separately. The code generation is done in such a way that the generated files combine to produce a single model using partial classes. For example, you will end up with a single strongly-typed unit of work class, with queryable properties for all of the entity types, although the implementation for this class will be spread across the generated files. Because LightSpeed depends on partial classes to combine code, all linked model files must be in the same project.
Set the entity's Name to PurchaseOrder and Is Link to True. Click OneToManyAssociation in the Toolbox and drag an arrow from PurchaseOrder to Shipment. Save both models and rebuild your project.
The defaults policy affects only newly created entities, not existing entities. If you create an entity by dragging a table from Server Explorer, LightSpeed uses the table definition to infer the identity type and storage options, ignoring the defaults policy, though the base class is still respected.
From this point on, whenever the model changes, the code will be regenerated using your copies of the templates. There are a couple of maintenance considerations for custom templates:

1. Visual Studio won't automatically regenerate code when you change the template. Changes to the templates will only take effect the next time you edit your model. If you don't actually need to do anything to the model, just move something and move it back again: that will be enough to trigger regeneration.

2. We occasionally ship updates to the default templates, to reflect new features or options, or to fix bugs. If these updates are relevant to you, you'll want to fold them into your custom templates. It's therefore a good idea to keep a copy of the Mindscape versions your custom templates are based on. That way, when we update the templates, you can use a diff and merge tool to find the changes between the Mindscape versions and merge them into your files (or to merge your diffs from the baseline onto the new baseline).
Once you have defined an extension property, it appears in the properties grid for every element of the kinds you specified in Extends, and you can enter values for it as if it were a built-in property.
To use an extension property in a template, there are three methods which you can call from the template:

HasExtendedProperty(name): Returns true if the user set a value for the named extension property on the element at hand, or false if the user did not set a value (or the extension property does not apply to this kind of element).

GetExtendedPropertyValue(name): Returns the value set by the user for the named extension property on the element at hand, and throws an exception if the user did not set a value. You should always call HasExtendedProperty before calling this method.

GetExtendedPropertyValue(name, defaultTo): Returns the value set by the user for the named extension property on the element at hand, or the defaultTo value if the user did not set a value (or the extension property does not apply to this kind of element). You can safely call this without calling HasExtendedProperty.
The value returned from GetExtendedPropertyValue is the actual value of the property. This may not be suitable for emitting into the generated code directly. For example, if GetExtendedPropertyValue returns a string, and you want to emit that string into a DisplayNameAttribute declaration, you will usually want to quote that string. Or if it returns an enum value, you will usually want to emit it qualified with the enum type name.
[DisplayName(Full Name)]      // would be an error
[DisplayName("Full Name")]    // correct
[Priority(High)]              // would be an error
[Priority(Priority.High)]     // correct
To convert a literal value to a code fragment for that literal value, call $Translator.TranslatePrimitive from your custom template. Generating DisplayNameAttribute from the DisplayName extension property
#if ($field.HasExtendedProperty("DisplayName"))
[DisplayName($Translator.TranslatePrimitive($field.GetExtendedPropertyValue("DisplayName")))]
#end
Using TranslatePrimitive ensures that values are converted to literals in a way which is correct for both the value and the target programming language. (Of course, if the extension property is intended to be used in a VTL expression such as a #if test, or if it represented a member name such as a property in a strongly-typed resource class, then you will not want to quote it. This is one of the reasons why GetExtendedPropertyValue returns a value rather than a code fragment.)
You must also add the paths to the designer assemblies to your project's Reference Paths collection (Project > Properties > Reference Paths). Alternatively, you can specify the path in the assembly directives. The designer assemblies are not redistributable and you should not add them to your project: T4 needs them to process the template, but you do not need them at run time. You must then specify the file extension for the generated file, using the output directive:
<#@ output extension=".txt" #>
Finally you must specify the LightSpeed model from which to generate code, using the LightSpeedModel directive. The processor attribute is always LightSpeedModelDirectiveProcessor;
the requires attribute should specify fileName='lsmodel_file'. The following directive hooks the T4 template up to a LightSpeed model named Sample.lsmodel:
(Line breaks added for clarity.)

<#@ LightSpeedModel processor="LightSpeedModelDirectiveProcessor"
    requires="fileName='Sample.lsmodel'" #>
You can now write T4 code as normal. The LightSpeed model is available through the this.Model reference.
<# foreach (Entity entity in this.Model.Entities) { #>
// Emit entity code here
<# } #>
The designer object model is not documented so you will need to ask in the LightSpeed forum or use the Visual Studio Object Browser to determine the programmatic names of metamodel classes and properties (though they usually correspond to the display names shown in the toolbox and property grid).
Many-to-Many Associations
As discussed in the chapter Creating Domain Models, LightSpeed represents many-to-many associations as through associations. A through association between A and B is implemented using a through entity, which represents an association between one A entity and one B entity. A given A entity may be associated with multiple through entities, each of which links on to one B entity, and vice versa. The designer provides two ways of presenting through associations. You can choose to show just the many-to-many relationship, treating the through entity as an internal implementation detail, which is the most convenient approach in most cases. Or you can choose to show the through entity explicitly, which provides you with fine control and extensibility at the expense of visual clutter.
If you have an explicit through entity and it's not adding any value, you can convert it to an auto through entity by right-clicking the through association and choosing Convert to Auto Through Entity.
entity which represents the contribution-tag association. With an explicit through entity this is easy, because you can work with the through entity just like any other entity. To specify an explicit through entity, select the through association arrow and choose the through entity from the Through Entity drop-down.
When you show an explicit through entity, you must also explicitly model the one-to-many associations between the main entities and the through entity. This allows you to control details such as foreign key column mapping. If you have an auto through entity and you need to add extra data or fine-tune the database mapping, you can convert it to an explicit through entity by right-clicking the through association and choosing Convert to Explicit Through Entity.
As in C#, you can use the question mark suffix to make the property nullable (e.g. int? Height). Using the Ins key and inline type editing, you can easily enter multiple properties without taking your hands away from the keyboard. This can be much quicker than using the mouse when entering a lot of properties.
Custom Attributes
You can apply custom attributes to generated properties using the various Custom Attributes collections. You can set custom attributes on entities and properties via the Custom Attributes option, and for associations you can set them on either end of the association and the foreign key (via options such as Collection Custom Attributes and Backreference Id Custom Attributes). Custom attributes are applied to the wrapper property, not the backing field. Most LightSpeed attributes have to go on the backing field, so you can't use custom attributes to apply LightSpeed attributes (it's usually more convenient to use the designer equivalents anyway). Rather, they are intended for attributes consumed by other frameworks, such as BrowsableAttribute or DisplayNameAttribute. When you enter an attribute in the Custom Attributes dialog, you must fully qualify the attribute name, e.g. System.ComponentModel.Browsable. If you're applying a lot of attributes from the same namespace, you can avoid this by adding the namespace to the Imported Namespaces list (via the LightSpeed Model Explorer).
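For example, entering System.ComponentModel.Browsable(false) for a property would decorate the generated wrapper property roughly as follows. This is an illustrative sketch only: the exact body LightSpeed generates depends on your templates, and the Set(...) change-tracking helper shown here is an assumption about the generated code's shape.

```csharp
// Illustrative sketch of generated output -- the attribute is applied to the
// wrapper property, never to the backing field.
[System.ComponentModel.Browsable(false)]
public string InternalCode
{
    get { return _internalCode; }
    set { Set(ref _internalCode, value); }
}
```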
If you want only a subset of your model, select the elements you want to include in the image before copying. If you are filtering the view, the copied image will show only the elements that are included in the current filter.
Rearranging Properties
To rearrange entries in an entity's properties list (for example, to alphabetise them or to group related items together), right-click a property and choose Move Up or Move Down.
The alternative approach is to define the base class in code, create an External Class Reference to that class via the LightSpeed Model Explorer, and set each entity's Base Class to the external reference via the Properties window. This avoids lots of inheritance arrows, but will result in a warning that the external class is being excluded each time you sync to the database.
Changing Property or Entity Names When Other Code Already Uses Them
If you want to change the name of a property or entity in the domain model, but you have lots of code that already uses the existing name, and you don't want to change the database schema either, you can use the Refactor > Rename command to help you out. Right-click the property or entity and choose Refactor > Rename. Enter the new name and make sure that Keep existing name as database column/table name is ticked. LightSpeed will update all references to the property or entity in your code, and create a mapping between the renamed element and the existing database name.
Many standard .NET methods and properties are built into the LightSpeed provider. However, your database may provide SQL functions that don't have a .NET equivalent, or you may have created user-defined functions that you want to use in queries. In this case, you can register a mapping between a .NET method and a SQL function. Once this is done, you can use the .NET method in a query, and LightSpeed will translate it to the specified function. To register a mapping between a .NET method and a SQL function, call ServerFunctionDescriptor.Register. You can register a member method or an extension method: this allows you to create methods on existing classes purely to have something to map to SQL. For example, suppose you wanted to call SQL Server's (admittedly antiquated) DIFFERENCE function, which returns how similar two strings sound. There is no String method that maps naturally to DIFFERENCE, but you can define and register an extension method:
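A sketch of such an extension method and its registration follows. The exact ServerFunctionDescriptor.Register overload used here is an assumption; check the API documentation for the precise signature.

```csharp
public static class StringServerFunctions
{
    // The body is never executed client-side; the method exists only so that
    // LINQ expressions using it can be translated to SQL.
    public static int SimilarityTo(this string value, string other)
    {
        throw new NotSupportedException("Only valid within a LightSpeed query");
    }
}

// At application startup, map the method to SQL Server's DIFFERENCE function
ServerFunctionDescriptor.Register(
    typeof(StringServerFunctions).GetMethod("SimilarityTo"),
    "DIFFERENCE");
```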
Once a method is mapped, you can use it in a LINQ query just as if it were built into LightSpeed: Using a mapped method
from t in UnitOfWork.Towns
orderby t.Name.SimilarityTo("radavleetsa") descending
select t;
The resulting SQL calls the SQL function to which the .NET method was mapped:
SELECT ... FROM Town ORDER BY DIFFERENCE(Name, @p0) DESC
CHARINDEX function. CHARINDEX requires the string to be looked for as the first argument, not the string to look in. ServerFunctionDescriptor.Register provides an overload which takes an implicit argument index. If you use this, the expression to which the .NET method is applied (the implicit argument) will appear at that (0-based) index in the SQL function's argument list.
ServerFunctionDescriptor.Register(indexOfMethod, "CHARINDEX", 1);
As with mapped functions in LINQ, the expression by default becomes the first argument, but you can override this by specifying an implicit argument index. Also as with LINQ mappings, you can specify that the function should be called using member syntax by prefixing it with a dot.
Entity.Attribute("UserName").Function(1, "CHARINDEX", " ") > 0;
Entity.Attribute("Location").Function(".STDistance", searchLocation) < distance;
Projections
You can use the Projection collection to load a specific set of columns (or computed expressions). When doing this you must call the IUnitOfWork.Project method instead of IUnitOfWork.Find. The results are not materialised into entities and do not become part of the unit of work. You can access the results via an ADO.NET IDataReader for raw access, or have them materialised into data objects using the IUnitOfWork.Project<T> overload. The latter is similar to LINQ projections using the Select operator. When performing a projection, you can also set the Distinct property to deduplicate the results. Examples: If you want to perform a single column projection which returns a primitive type or a string then you can use the Project<T> overload as shown below: Projecting a single column and returning a list of Int32s
var query = new Query(typeof(Contribution));
query.Projection.Add("ContributorId");
var results = unitOfWork.Project<int>(query);
To deduplicate the results in the query above, we would use the Distinct property as shown below:
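A sketch of the deduplicated version of the query (this assumes Distinct is exposed as a boolean property on the Query object, as the description above suggests):

```csharp
var query = new Query(typeof(Contribution));
query.Projection.Add("ContributorId");
query.Distinct = true;  // deduplicate the projected ContributorId values
var results = unitOfWork.Project<int>(query);
```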
If we want to project more than one column to a known type, that type needs to declare properties that match the names of the projected columns we are going to be returning. An example with two properties is shown below, but the same approach applies regardless of how many columns are involved in the projection. Projecting multiple columns and returning a list of a specified type
struct MyProjectionResult
{
    public int ContributorId { get; set; }
    public int ApprovedById { get; set; }
}

var query = new Query(typeof(Contribution));
query.Projection.Add("ContributorId");
query.Projection.Add("ApprovedById");
var results = unitOfWork.Project<MyProjectionResult>(query);
If you require a custom mapping to be applied, or don't require object instances to be returned to you, then you can use the Project() method, which will return you a DataReader instance. Projecting and returning a DataReader instance
var query = new Query(typeof(Contribution));
query.Projection.Add("ContributorId");
query.Projection.Add("ApprovedById");

using (var reader = unitOfWork.Project(query))
{
    while (reader.Read())
    {
        // do work
    }
}
Views
Specify the ViewName property to run the query against a view instead of against the entitys normal backing table. See Working with Database Views in the chapter Controlling the Database Mapping for details.
Controlling Aliasing
Aliasing for a LightSpeed query is specified through the Mappings property. The Mappings object allows you to specify a mapping between a type or query and a name. This alias is then either used automatically when LightSpeed builds the query, or can be referenced when building up a query or a join to indicate which alias should be used. Managing aliasing manually should be required only for very precise control over queries, as LightSpeed will generate the required aliasing automatically for all queries. Manually specifying aliasing for a query
var query = new Query(typeof(Contribution));
query.Mappings.Add<Contribution>("t0");
query.Mappings.Add<Member>("t1");
One word of caution when manually specifying aliasing for a query: the Mappings collection is used by LightSpeed to understand all of the types involved in the query, so if a mapping is added to the collection but not subsequently used, a CROSS JOIN will be applied to join that table into the query, ensuring that any arbitrary criteria that may reference the aliased entry can be resolved.
Joins
Joins allow you to define expressions that allow multiple entities to participate in a query. A Join defined as part of a query is translated directly into a SQL join when LightSpeed translates the query into the underlying SQL statements. LightSpeed allows you to define three types of joins:

Inner: The intersection of the two entity sets, matched on the supplied keys.

Outer: All rows in the left hand entity set, paired with corresponding rows in the right hand entity set, or null if no corresponding join can be made. This conforms to the syntax of a LEFT OUTER JOIN in SQL.

CrossJoin: The Cartesian product of the two entity sets.
To specify a join, you must assign an instance of the Join class to the Join property. Multiple joins can be specified by use of the .And() method which is available on any Join instance. LightSpeed offers several static methods to allow for easy instantiation of Join instances according to the type required and allows for generic specification of the two entity types involved.
For more advanced scenarios, such as joining against a sub-query, there are additional overloads available. If you are interested in such scenarios, please review the API documentation for the static Join instantiation methods named Inner() and Outer(). When performing a join, LightSpeed will automatically include all fields from the joined entity or query unless you have explicitly defined a projection for the query. The rationale for this is that you are surfacing entities rather than individual columns. If your intention is to make use of data only from specific fields, we strongly recommend you declare a set of appropriate projections to match the data you intend to use. Review the Projections section in this document for more information on how to achieve this. Examples: To specify an Inner Join you can use the Inner<TLeft, TRight>() method as shown below: Specifying an inner join between two entity types
var query = new Query(typeof(Contribution));
query.Join = Join.Inner<Contribution, Member>("ContributorId", "Id");
To specify an Outer Join you can use the Outer<TLeft, TRight>() method as shown below: Specifying an outer join between two entity types
var query = new Query(typeof(Contribution));
query.Join = Join.Outer<Contribution, Member>("ApprovedById", "Id");
To specify a Cross Join you can use the CrossJoin<TLeft, TRight>() method as shown below: Specifying a cross join between two entity types
var query = new Query(typeof(Contribution));
query.Join = Join.CrossJoin<Contribution, Member>();
If you need to specify multiple joins, you can use the .And() method to chain your join instances together. An example of this is shown below: Specifying multiple joins using Join.And()
var query = new Query(typeof(Contribution));
query.Join = Join.CrossJoin<Contribution, Member>().And(
    Join.Inner<Contribution, Comment>("Id", "ContributionId"));
In advanced scenarios you may require LightSpeed to join against a sub-query, for example to surface specific projected fields from the sub-query while using it to filter or scope the top-level query. The .Inner() and .Outer() methods provide an overload which allows you to join on another Query instance, performing a join of the nominated type against that sub-query. LightSpeed does not currently offer this for cross joins, and these overloads are not available on the generic versions of these methods. An example of how this might be used is shown below: Specifying a join between an entity type and a query
var innerQuery = new Query(typeof(Member), Entity.Attribute("Username") == "jb");

var query = new Query(typeof(Contribution));
query.Join = Join.Outer(typeof(Contribution), innerQuery, "ContributorId", "Id");
Grouping
The Group property allows you to declare a grouping which will be applied to the query, and which will ultimately be translated into a SQL GROUP BY clause. A grouping is specified by assigning a Group instance to the Group property on the query object. The Group class provides static methods for instantiating Group instances, and the .AndBy() method for specifying multiple grouping columns. If you are performing a grouping query, you will always need to obtain your results using a call to Project or Project<T>, as LightSpeed will by default select back only the columns which are being grouped on. You can alternatively specify additional columns to be selected by explicitly declaring a projection set for the query; note that if you do so, you need to ensure you include the columns being grouped on, as this overrides the default behaviour. Specifying a projection set also allows you to add aggregates into the query. LINQ translations of grouping statements will typically perform two underlying calls: one to perform the grouping query, and then a second batch to load the entities involved in the query and assign them into the corresponding grouped collections, allowing further client-side projections to take place against those entities.
Examples: To perform a standard grouping operation where the result will be a single column of the grouping key you can specify this as shown below: A basic grouping operation
var query = new Query(typeof(Contribution));
query.Group = Group.By("ContributorId");
var results = unitOfWork.Project<int>(query);
Alternatively you can specify a projection set and use the Group.BySelection() method to indicate that LightSpeed should group by every column in the projection set. You need to ensure that this will generate a valid query for your target database: for example, most database providers will not support grouping on aggregate expressions. Asking LightSpeed to group by the selection defined in your Projection set
struct MyGroupingResult
{
    public int ContributorId { get; set; }
    public int ApprovedById { get; set; }
}

var query = new Query(typeof(Contribution));
query.Group = Group.BySelection();
query.Projection.Add("ContributorId");
query.Projection.Add("ApprovedById");
var results = unitOfWork.Project<MyGroupingResult>(query);
Alternatively you can use the .AndBy() method to chain your grouping expressions together, so to rewrite the above query using this approach would look like this: Using AndBy to chain grouping expressions
struct MyGroupingResult
{
    public int ContributorId { get; set; }
    public int ApprovedById { get; set; }
}

var query = new Query(typeof(Contribution));
query.Group = Group.By("ContributorId").AndBy("ApprovedById");
var results = unitOfWork.Project<MyGroupingResult>(query);
Finally you can return aggregates as part of the result set by declaring these in your Projection set while independently specifying the grouping keys using Group instances. An example of this is shown below: Returning Aggregates as part of your projected result set
struct MyGroupingResult
{
    public int ContributorId { get; set; }
    public int Views { get; set; }
}

var query = new Query(typeof(Contribution));
query.Group = Group.By("ContributorId");
query.Projection.Add("ContributorId");
query.Projection.Add(Entity.Attribute("Views").Function("SUM"));
var results = unitOfWork.Project<MyGroupingResult>(query);
Subexpressions
Subexpressions allow you to define common expressions that can be referenced by name in the query. See Subexpressions below for details.
Hints
The Hints property allows you to pass index or table hints to databases where this is supported. See the Performance and Tuning chapter for more information.
Subexpressions
When you're writing a query, you may find that an operation over an associated collection comes up repeatedly. For example, you may want to perform a calculation over the collection, and both order and filter by the result of this calculation. To avoid retyping the common expressions each time (and having the database engine re-evaluate them each time), you can define subexpressions for them, then use these subexpressions as if they were actual attributes of the entity. To define a subexpression, add it to the Query.Subexpressions collection. You must specify the name of the subexpression, the query expression it encapsulates, and the column in the target table on which to join to the main table. Defining a subexpression
query.Subexpressions.Add(
    "TotalFreight",
    Entity.Attribute("Orders.Freight").Function("SUM"),
    "CustomerId");
Once a subexpression is defined, you can use it by specifying its name. You can use a subexpression name anywhere you would normally use a field name: typically in a query expression, sort order or projection.
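For example, the TotalFreight subexpression defined above could be used to both filter and sort customers. The QueryExpression property and the Order.By(...).Descending() call here are assumptions about the query-object API; check the API documentation for the exact members.

```csharp
var query = new Query(typeof(Customer));
query.Subexpressions.Add(
    "TotalFreight",
    Entity.Attribute("Orders.Freight").Function("SUM"),
    "CustomerId");

// Use the subexpression name as if it were a field of Customer
query.QueryExpression = Entity.Attribute("TotalFreight") > 1000;
query.Order = Order.By("TotalFreight").Descending();
```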
In LINQ, you can create subexpressions using the let keyword; no special APIs are required.
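A hypothetical LINQ equivalent of the TotalFreight example, with the shared calculation factored out using let (the entity and property names are illustrative):

```csharp
var bigCustomers =
    from c in unitOfWork.Customers
    let totalFreight = c.Orders.Sum(o => o.Freight)
    where totalFreight > 1000
    orderby totalFreight descending
    select c;
```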
Note that an EntityInfo represents a class, not an individual entity instance. Similarly, a FieldInfo represents a class member, not an instance of the field on a particular entity instance.
Remember that the EntityInfo represents the metadata for the type of the entity: it is not specific to the entity instance. You can also get the metadata for an entity type without having an entity instance by using the EntityInfo.FromType static method: Getting the metadata from an entity type
var classInfo = EntityInfo.FromType(typeof(Order));
The FieldInfo and AssociationInfo classes have a number of properties relating to the field or association definition. For example, you can get the data type using the FieldType property, or the cardinality of an association using the AssociationType property. Again, see the API documentation for information about these properties.
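As a quick sketch, the following dumps the name and type of every persistent field on an entity class, using the FlattenedFields collection and PropertyName property that appear elsewhere in this chapter (verify the exact member names against the API documentation):

```csharp
var classInfo = EntityInfo.FromType(typeof(Order));

// List every persistent field, including inherited ones
foreach (var field in classInfo.FlattenedFields)
{
    Console.WriteLine("{0} : {1}", field.PropertyName, field.FieldType);
}
```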
GetValue accepts any entity, but throws an exception if the entity is not of the type on which the FieldInfo is defined. This can be an issue when inheritance is in play: if you are processing entities from across the inheritance hierarchy, then you will probably be using a flattened collection to ensure you process all fields, but a particular entity instance won't have the fields declared in sibling or derived classes. To check if the FieldInfo is defined on the entity, call IsDefined: Using IsDefined to check that it is safe to get a field value
var classInfo = EntityInfo.FromType(typeof(SalesItem));

// Not defined on all SalesItem subclasses
var publisherField = classInfo.FlattenedFields.First(f => f.PropertyName == "Publisher");

if (publisherField.IsDefined(salesItem))
{
    object publisher = publisherField.GetValue(salesItem);
}
FieldInfo also provides a TryGetValue method which combines the check with the retrieval, but using IsDefined and GetValue is usually more convenient if you want to cast the value to a particular type.
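Assuming TryGetValue follows the standard .NET try-pattern (an out parameter plus a boolean result; this signature is a guess, so verify it against the API documentation), the combined form would look like:

```csharp
object publisher;
if (publisherField.TryGetValue(salesItem, out publisher))
{
    Console.WriteLine("Publisher: {0}", publisher);
}
```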
As with GetValue, you can use IsDefined to check that the field exists on the entity instance at hand, or TrySetValue to combine the check with the set. When you call FieldInfo.SetValue, you are directly setting the underlying CLR field. (Remember, the LightSpeed metamodel represents fields, not properties.) The property setter is not called. If you have custom logic in the property setter, for example to validate or transform input values, FieldInfo.SetValue bypasses that custom logic.
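For example, this sketch writes a field directly through the metamodel; because SetValue assigns the underlying CLR field, any validation or transformation logic in the Name property setter would not run (Customer and Name are illustrative names):

```csharp
var classInfo = EntityInfo.FromType(typeof(Customer));
var nameField = classInfo.FlattenedFields.First(f => f.PropertyName == "Name");

// Sets the backing field directly; the Name property setter is bypassed
nameField.SetValue(customer, "Acme Ltd");
```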
This is a safe, conservative strategy which avoids superfluous loads. It is referred to as lazy loading because it lazily doesn't do any work it doesn't need to. However, when a unit of work needs a number of related entities, it can be inefficient. In the code fragment above, LightSpeed issued three database queries. If we knew ahead of time that the application was going to need the Customer entity and the Lines collection, it would have been more performant to perform all three queries in a single go. The problem gets worse when dealing with collections of entities. For example:
var overdues = unitOfWork.Orders.Where(o => o.DueDate < today);

foreach (var o in overdues)  // queries the database for Orders
{
    var customer = o.Customer;  // queries the database for a Customer
    Console.WriteLine("Order {0} for customer {1} is overdue", o.Reference, customer.Name);
}
This code fragment results in one query for the orders, then, for each order, another query for the customer. If the first query returned 1000 orders, then we would perform 1000 queries for customers: a total of 1001 queries. This is called the N+1 problem, because you need N+1 queries to process N objects. LightSpeed makes it easy to avoid these problems using eager loading.
Eager Loading
Eager loading means loading associated entities before they are needed. In LightSpeed, eager loading specifically means loading associated entities using the same database query as the source entity, avoiding an extra round trip. Here is how our N+1 code fragment works with eager loading.
var overdues = unitOfWork.Orders.Where(o => o.DueDate < today);

foreach (var o in overdues)  // queries the database for Orders *and* Customers
{
    var customer = o.Customer;  // Customer is already loaded, no database query
    Console.WriteLine("Order {0} for customer {1} is overdue", o.Reference, customer.Name);
}
By using eager loading we replace 1001 trips to the database with a single trip! Of course, that one trip now has to do more work, so you should eager load only when you know you are going to need the associations. To implement eager loading, select an association that you want to be eager loaded, and:

If you want parent entities to eager load their children, set Eager Load Collection to True.

If you want child entities to eager load their parents, set Eager Load Backreference to True.
In the example above, because you want the Order entity to eager load its associated Customer, you would select the Order to Customer association and set Eager Load Backreference to True. If you also wanted the Customer entity to eager load its collection of Orders, you would also set Eager Load Collection to True. (An association can be eager loaded in both directions.) For hand-coded associations, you implement eager loading by applying the EagerLoad attribute to the association field: Implementing eager loading on hand-coded associations
public class Order : Entity<int>
{
    [EagerLoad]
    private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();

    [EagerLoad]
    private readonly EntityCollection<Line> _lines = new EntityCollection<Line>();
}
Eager loading cascades, allowing you to load a whole object graph in a single database round-trip. For example, if Customer.Orders is eager loaded, and Order.Lines is eager loaded, then when you load one or more Customers, LightSpeed will automatically fetch not only the orders for all those customers but also the lines for all those orders.
the Lines collection but it might want to eager load the customers so that it could show the user who each overdue order belonged to. To support conditionally eager loading an association, LightSpeed uses named aggregates. Aggregate is a term from domain-driven design that refers to a bounded object graph needed to satisfy a particular application use case. (In the example above, there were two aggregates. The order details page was interested in an order-plus-lines aggregate. The overdue orders page was interested in an order-plus-customer aggregate.) Named aggregates allow you to name particular object graphs, and tell LightSpeed to eager load different aggregates in different situations. To specify that an association is part of a named aggregate, select the association arrow, and enter the aggregate name in the Collection Aggregates or Backreference Aggregates box (or both). An association can be part of multiple named aggregates; separate the aggregate names with semicolons. For hand-coded associations, specify the AggregateName property on the EagerLoad attribute. You can specify the attribute multiple times. Specifying that a hand-coded association is part of a named aggregate
public class Order : Entity<int>
{
    [EagerLoad(AggregateName = "WithAllDetails")]
    private readonly EntityHolder<Customer> _customer = new EntityHolder<Customer>();

    [EagerLoad(AggregateName = "WithAllDetails")]
    [EagerLoad(AggregateName = "WithLines")]
    private readonly EntityCollection<Line> _lines = new EntityCollection<Line>();
}
To eager load a particular aggregate using LINQ, specify it in your query using the WithAggregate() method: Eager loading a named aggregate using LINQ
// Loads orders and associated lines, but not associated customers
var orders = unitOfWork.Orders
                       .Where(o => o.CustomerId == customerId)
                       .WithAggregate("WithLines")
                       .ToList();
To eager load a particular aggregate using query objects, specify it using the Query.AggregateName property:
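The query-objects equivalent of the LINQ sample above might look like the following sketch, which assumes the Query construction style shown elsewhere in this guide (the exact Find call shape is illustrative):

```csharp
// Sketch: loads orders and associated lines for a customer using query
// objects, with Query.AggregateName selecting the WithLines aggregate
Query query = new Query(typeof(Order),
    Entity.Attribute("CustomerId") == customerId);
query.AggregateName = "WithLines";
var orders = unitOfWork.Find(query);
```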
In both of the above samples, changing WithLines to WithAllDetails would eager load both the lines and the customers. Omitting the aggregate would mean both the lines and the customers were lazy loaded. The aggregate name is an attribute of a query, and affects all associations in the aggregate, not just the associations on the starting object. For example, if both Customer.Orders and Order.Lines are marked with the WithAllDetails aggregate, then loading a Customer with the WithAllDetails aggregate will load the customer's orders, and all the lines of those orders. This allows you to load deep object graphs in a single database access if required.
The marked field will now be loaded only when you access it:
using (IUnitOfWork unitOfWork = _context.CreateUnitOfWork())
{
    var employee = unitOfWork.FindById<Employee>(1);  // does not load security photo
    var name = employee.Name;           // already loaded, no database query
    var photo = employee.SecurityPhoto; // queries database to load SecurityPhoto column
}
However, if you supply one of the aggregates that the field is part of, using the WithAggregate method or the Query.AggregateName property, then the field will be eager-loaded as if it were a normal field:
using (StaffUnitOfWork unitOfWork = _context.CreateUnitOfWork())
{
    var employee = unitOfWork.Employees
                             .WithAggregate("WithSecurityBlobs")
                             .Single(e => e.Id == 1);  // loads security photo
    var photo = employee.SecurityPhoto;  // already loaded, no database query
}
The aggregate name is an attribute of a query, and affects all fields in the aggregate, not just the fields on the starting object. For example, if Site.Employees is eager loaded or is marked with the WithSecurityBlobs aggregate, then loading a Site with the WithSecurityBlobs aggregate will load all the employees on that site, including their SecurityPhoto fields.
Visualising Aggregates
You can use the designer to help you visualise named aggregates. See Custom Views in the chapter Working with Models in the Visual Designer.
Bulk Updates
IUnitOfWork.Update takes a query specifying which rows to update, and an object or dictionary specifying the values to update those rows with. If you pass an object, then LightSpeed uses the names and values of its properties to compose the update. If you pass a dictionary, then LightSpeed uses the string keys of the dictionary and their associated values. The following example uses an anonymous type to update the IsBadPaymentRisk column. Performing a bulk update using a change object
DateTime overdueDate = DateTime.Now.AddMonths(-6);
Query overdueAccounts = new Query(typeof(Customer),
    Entity.Attribute("Balance") < 0 &&
    Entity.Attribute("LastPaymentDate") < overdueDate);
unitOfWork.Update(overdueAccounts, new { IsBadPaymentRisk = true });
unitOfWork.SaveChanges();
The following example uses a dictionary to perform the same update. Performing a bulk update using a change dictionary
DateTime overdueDate = DateTime.Now.AddMonths(-6);
Query overdueAccounts = new Query(typeof(Customer),
    Entity.Attribute("Balance") < 0 &&
    Entity.Attribute("LastPaymentDate") < overdueDate);
Dictionary<string, object> changes = new Dictionary<string, object>();
changes.Add("IsBadPaymentRisk", true);
unitOfWork.Update(overdueAccounts, changes);
unitOfWork.SaveChanges();
Note that like entity operations, bulk operations are not applied to the database immediately. You must still call SaveChanges. This allows bulk updates to be carried out atomically and to be coordinated with entity operations.
Bulk Deletes
IUnitOfWork.Remove takes a query specifying which rows to delete. Performing a bulk delete
DateTime expiryDate = DateTime.Now.AddMonths(-6);
Query expiredUsers = new Query(typeof(User),
    Entity.Attribute("LastActiveDate") < expiryDate);
unitOfWork.Remove(expiredUsers);
unitOfWork.SaveChanges();
As noted under Bulk Updates, you must remember to call SaveChanges to commit the bulk delete.
Batching
When you call IUnitOfWork.SaveChanges, LightSpeed flushes all pending changes (inserts, updates and deletes) to the database. To reduce the number of round-trips to the database, LightSpeed collects these pending changes into batches for submission to the database. Note that LightSpeed always commits changes under the aegis of a transaction. Therefore, even if there are more SQL statements than will fit in one batch, you are still guaranteed that the save will be atomic.
Increasing the batch size will reduce the number of database round-trips during a large save, but results in much longer SQL commands. Increasing batch size too far can therefore reduce performance instead of improving it. If you make a change, be sure to measure the impact on a system whose configuration (speed, throughput, latency) is as similar as possible to your production system. Particularly in low latency environments, reducing round trips may be a poor trade off.
Caching
LightSpeed provides two mechanisms for caching entity instances: a first level cache and a second level cache. Both caches are designed to reduce the number of database queries that need to be executed. The first level cache, also known as the identity map, is always in use: it covers all entities and cannot be disabled. The second level cache, by contrast, is disabled by default; it must be explicitly enabled and configured, and it extends only to entities which you have explicitly opted in through use of the Cached attribute. First level caching is scoped to the current unit of work, so entities cached in one unit of work are not available as cached objects inside a second unit of work. The second level cache provides application-wide caching beyond the bounds of a unit of work. An example of how the first and second level caches will be hit
using (var unitOfWork = context.CreateUnitOfWork())
{
    var contrib1 = unitOfWork.FindById<Contribution>(52);

    // This request will be returned from the level one cache in memory
    var contrib2 = unitOfWork.FindById<Contribution>(52);
}

using (var unitOfWork = context.CreateUnitOfWork())
{
    // If the level two cache is not enabled, this would generate
    // a database call. With the level two cache enabled it would
    // be fetched from the cache as we fetched it in the previous
    // unit of work
    var contrib1 = unitOfWork.FindById<Contribution>(52);
}
MemcachedCache: Uses the popular memcached distributed cache. Memcached can be distributed over an arbitrary number of servers and is suitable for high-load scenarios.
Additionally, you can implement your own cache provider by implementing the ICache interface. You can enable second level caching either programmatically or through configuration, as shown below: Enabling Second Level Caching programmatically
context.Cache = new CacheBroker(new DefaultCache());
Alternatively, second level caching can be enabled using the LightSpeed designer by selecting an entity and setting its cache option to enabled. Cache expiry times can also be configured in the designer if desired.
When LightSpeed loads an entity decorated with the CachedAttribute it will immediately add it to the second-level cache. Any subsequent request for that entity by its id will cause a cache hit and the database load will be avoided.
Note: If you are dealing with projections then no cache hits will be possible, even if the projection contains the entity identifier.
Caching Considerations
Caching can sound like a wonderful way to easily improve the performance of your application; however, it is not a silver bullet. The primary concerns to consider are cache item expiry and whether stale data could impact your system. Cache expiry is a complex topic and worth reading more about if you are seriously considering second level caching options for a production application. Consider why you are trying to cache data more aggressively and whether LightSpeed caching is the best solution. For example, with web applications it may be far more efficient to enable output caching on the web pages. The initial load of a page may cause several database hits, but subsequent requests will not cause any queries to fire and will not invoke the page creation process in .NET.
Database Hints
When you send a query to a database, the database engine works out how to execute that query using various heuristics. This usually results in an extremely efficient plan. However, sometimes you can work out a better plan based on your knowledge of the database or of the broader application context. For example, you might have found through testing that the database engine is not using an index which could improve performance. Or you might know that it doesn't matter if a particular query reads rows that are in the process of being written, meaning the database engine can skip the safety overhead of a lock. In such situations, some databases allow you to pass hints to the database engine on how to execute the query. This section shows how to specify these hints on LightSpeed queries. In all cases, it's important to remember that hints are exactly that: hints. The query planner doesn't have to obey them. You are providing advice on how to execute the query: the database engine is free to overrule that advice.
Index Hints
An index hint advises the database to use a particular index. You can pass an index hint using the WithIndexHint operator, providing the name of the index you want to use. Providing an index hint
var orders = unitOfWork.Orders
                       .WithIndexHint("IX_OrderDueDate")
                       .Where(o => o.DueDate < today)
                       .ToList();
You can pass multiple index names to WithIndexHint if required. At the time of writing, index hints are supported on Oracle and SQL Server. Using an index hint on another database does not cause an error (hints are advisory anyway) but is ignored.
Table Hints
A table hint advises the database about the way the query uses the table being queried. Table hints can be used for a variety of tasks. For example, the SQL Server NOLOCK table hint indicates that the table need not be locked during the query, trading improved performance for the risk of dirty reads. You can pass a table hint using the WithTableHint operator.
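As a sketch, passing the NOLOCK hint mentioned above might look like this (the query shape mirrors the index hint example; the hint text itself is passed through to SQL Server unaltered):

```csharp
// Sketch: advise SQL Server that the Orders table need not be locked for
// this query, trading the risk of dirty reads for reduced locking overhead
var orders = unitOfWork.Orders
                       .WithTableHint("NOLOCK")
                       .Where(o => o.DueDate < today)
                       .ToList();
```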
Table hints are database-specific: LightSpeed simply passes the raw hint text to the database. At the time of writing, table hints are supported only on SQL Server. Using a table hint on another database does not cause an error (hints are advisory anyway) but is ignored.
Measuring Performance
Although you can use standard .NET tools such as profilers to measure performance in LightSpeed applications, the LightSpeed logging interface allows you to carry out simple measurements of query performance. For each database access, you can have LightSpeed log the SQL sent to the database, and how long it took for the database to respond. You can view this information through the console or Visual Studio Output window, or capture or analyse it by processing CommandLog records in a custom logger. See Logging in the Testing and Debugging chapter for more information.
Entity Tracking
Storing Creation and Update Times
To have LightSpeed automatically store the time when entities of a particular type are created and updated, select the entity and set Track Create Time and/or Track Update Time to true. The Track Create Time option creates a field named CreatedOn. LightSpeed automatically populates this field when the entity is first saved. You can access this field normally and use it in queries. The Track Update Time option creates a field named UpdatedOn. LightSpeed automatically populates this field whenever the entity is saved. As with CreatedOn, you can access this field normally and use it in queries. The database must contain the corresponding backing columns for whichever options are selected. If you use Update Database or Create Migration in the designer, it will create these columns for you. If you create these columns manually, they must be non-nullable columns of date-time type.
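Because CreatedOn and UpdatedOn are ordinary fields, they can be used in queries like any other property. A brief sketch (the Employee entity and the one-week cutoff are illustrative):

```csharp
// Sketch: find entities created within the last week, using the CreatedOn
// field that the Track Create Time option generates and populates
DateTime cutoff = DateTime.Now.AddDays(-7);
var recentHires = unitOfWork.Employees
                            .Where(e => e.CreatedOn > cutoff)
                            .ToList();
```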
Soft Deletion
By default, when you delete an entity through LightSpeed, it is deleted from the database; that is, LightSpeed sends a SQL DELETE statement. However, you can specify that entities of a particular type should be soft deleted: instead of being deleted, they are left in the database but marked with a deleted flag. To have LightSpeed use soft deletion for an entity type, select the entity and set its Soft Delete option to true. The Soft Delete option creates a field named DeletedOn. The DeletedOn field is a nullable DateTime, which is null by default, but which LightSpeed automatically populates when the entity is deleted. The database must contain a corresponding DeletedOn column. If you use Update Database or Create Migration in the designer, it will create this column for you. If you create the column manually, it must be a nullable column of date-time type.
IncludeDeleted does not limit the query to deleted entities: it will return both deleted and non-deleted entities. You can use the DeletedOn field to distinguish deleted entities. LightSpeed does not provide a way to purge or restore soft deleted entities. These are database administration tasks and should be handled as such.
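As a hypothetical sketch of distinguishing deleted entities with DeletedOn (the exact way IncludeDeleted is specified on a query is an assumption here, as is the User.Name property):

```csharp
// Hypothetical sketch: retrieve both deleted and non-deleted users, then
// use DeletedOn (null means not deleted) to tell them apart
Query query = new Query(typeof(User)) { IncludeDeleted = true };
foreach (User user in unitOfWork.Find(query))
{
    bool isDeleted = user.DeletedOn != null;
    Console.WriteLine("{0}: {1}", user.Name, isDeleted ? "deleted" : "active");
}
```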
In both cases the time comes from the clock of the machine where LightSpeed is running. If clocks are not well synchronised then the errors will be reflected in the stored timestamps.
Concurrent Editing
LightSpeed uses optimistic concurrency: that is, it allows users to edit an entity without taking a lock, optimistically assuming that it is okay for users to edit the same entity at the same time. When each user commits their changes, their version of the entity is persisted. LightSpeed doesn't check whether there has been a concurrent edit. This is effectively a "last one wins" policy. LightSpeed also supports concurrency checking, which results in an exception being thrown if a user tries to save an entity when that entity has been modified since the user first loaded it. This is effectively a "first one wins" policy. To enable concurrency checking for entities of a particular type, select the entity and set Optimistic Concurrency Checking to true. The Optimistic Concurrency Checking option creates a field named LockVersion. LightSpeed automatically increments this field each time the entity is saved, and verifies that the version of the entity in the database is still the version that was originally loaded. If not, it means that somebody else has changed the entity in the meantime, and LightSpeed raises an OptimisticConcurrencyException. The database must contain the corresponding backing column for the LockVersion field. If you use Update Database or Create Migration in the designer, it will create this column for you. If you create the column manually, it must be a non-nullable column of integer type.
Special fields should normally be marked readonly as they should be modified only by LightSpeed, not by application code. This will cause a compiler warning because there is no way to set the fields: you should disable this warning as shown above. ASP.NET medium trust environments do not allow LightSpeed to modify readonly fields: if you plan to use your model in such an environment, implement the special fields as read-write.
In addition, you will need to tell LightSpeed to use stored procedures to load child collections in associations. To do this, select the association arrow and fill in the Select Procedure option.
Delete Procedure: Deletes a single entity. The procedure should take a single parameter, whose name is the Identity Column Name for the entity, and whose value is the entity Id to delete.
You can also define an access procedure on an association, for looking up child collections: Select Procedure: Selects all entities of the child type whose foreign key is equal to the parameter value. The procedure should take a single parameter, whose name is the column name of the foreign key of the association, and whose value is the Id of the parent.
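As a sketch, a Select Procedure for a Customer-to-Order association on SQL Server might look like the following (the procedure, table and column names are illustrative; only the parameter convention is taken from the text above):

```sql
-- Selects all Orders belonging to the given Customer. The single parameter
-- is named after the foreign key column of the association, and its value
-- is the Id of the parent entity, as the convention above requires.
CREATE PROCEDURE SelectOrdersByCustomer
    @CustomerId int
AS
    SELECT * FROM [Order] WHERE CustomerId = @CustomerId
```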
CRUD procedures must also adhere to any database-specific conventions for LightSpeed stored procedures. This specifically affects Oracle databases. See Working with Database Providers for details.
Implementing GeneratedId
public partial class Product : Entity<string>
{
    [Transient]
    private string _productCode;  // Must be transient

    protected override object GeneratedId()
    {
        return _productCode;
    }
}
If the _productCode field were not transient, LightSpeed would try to map it to a ProductCode column in the database. If there was no ProductCode column, this would cause an error. If ProductCode was the natural key (Id) column, then that column would now be mapped twice, once to Id and once to _productCode, which is also an error. (If there was a ProductCode column, independent of the natural key, then there would be no error, but this is unlikely, as it would mean the product key was stored in two places. However, something like this could happen if the natural key were derived from other columns, but not identical to those columns.)
The identity type is automatically generated. Its name is the entity name followed by Id. For example, an entity named InternationalProduct will have an identity type named InternationalProductId. You can create composite keys in a model-first approach, by using the LightSpeed Model Explorer to add identity properties to an entity. You should not normally do this: if you are defining new entities rather than mapping existing tables, you should use a surrogate key instead.
You can then specify this type as the TId type parameter when deriving from Entity<TId>: Declaring a composite keyed entity
public class InternationalProduct : Entity<InternationalProductId>
If you are using the designer, it can generate this code for you, but it may not be able to infer the column mappings from the database. You may therefore need to manually set up the association and map columns.
Custom Wrappers
As previously noted, LightSpeed maps .NET fields to database columns, and ignores .NET properties. This means you are free to define wrapper properties or methods with different types from the underlying field. These wrappers can convert between the domain type, which they present in the public interface of the entity, and the database representation, which is encapsulated in the private fields. To implement this, select the field you want to wrap and set its Generation option to FieldOnly. Then open or create a partial class file for the entity, and implement the wrapper property or methods yourself. Surfacing a database string as a domain Boolean property
partial class Member
{
    // The _isModerator string field is in designer generated code
    public bool IsModerator
    {
        get { return _isModerator == "Y"; }
        set
        {
            string dbIsModerator = value ? "Y" : "N";
            Set(ref _isModerator, dbIsModerator, "IsModerator");
        }
    }
}
Remember to use the Entity.Set method to set the value of persistent fields. A crucial pitfall of this approach is that the translation logic is not applied to queries. Users must write queries in terms of the underlying database format, not the domain format. This is confusing when using the query objects syntax, and is not possible at all in LINQ:
Entity.Attribute("IsModerator") == "Y"   // works
Entity.Attribute("IsModerator") == true  // fails

from m in uow.Members where m.IsModerator == "Y" select m  // does not compile
Therefore, if you use a custom wrapper, and anticipate users wanting to query on the wrapped field, be sure to encapsulate all querying logic behind a repository API which can perform the necessary translations.
Field Converters
A field converter encapsulates the translation between database and domain as a reusable object. This is useful for two reasons: first, it allows that translation to be applied to queries; and second, it allows the same translation to be applied to many different fields without needing to write a custom wrapper for each. There are also specialised cases which are problematic for custom mapping, such as LightSpeed 3-style data transfer objects, or TimeSpan fields where LightSpeed's built-in handling may clash with a desired database-specific mapping. To implement a field converter, implement the IFieldConverter interface, or derive from the FieldConverter<TDatabase, TModel> base class. Converting a database string to a Boolean value
public class YesNoConverter : FieldConverter<string, bool>
{
    protected override bool ConvertFromDatabase(string databaseValue)
    {
        return "Y".Equals(databaseValue, StringComparison.OrdinalIgnoreCase);
    }

    protected override string ConvertToDatabase(bool value)
    {
        return value ? "Y" : "N";
    }
}
The LightSpeed designer represents columns requiring conversion using a user-defined type. For example, you might represent the Y/N Boolean idiom as a YesNo type. This saves you applying the converter separately to each field that needs it; instead, the user-defined type bundles up the domain type, the database type and the mapping between them. Note that the user-defined type will still surface in the domain model as (in this case) a Boolean. To create a user-defined type that represents a custom format, open the LightSpeed Model Explorer, right-click the root model node, choose Add New User Defined Type, and apply the following settings:
Name: The name to appear in the Data Type drop-down, e.g. YesNo.
CLR Type Name: How this type should be surfaced in the domain model, e.g. System.Boolean. This should be a namespace-qualified CLR type name (see Enums and Other User-Defined Types in the chapter Working with Models in the Visual Designer for more information).
Is Value Type: Set this to True or False depending on whether the CLR type is a struct or a class.
Data Type: How this type is represented in the database, e.g. String.
Converter Type: The name of the class which converts between the database representation and the CLR type, e.g. MyProject.YesNoConverter.
Is Standard Data Type: You can usually leave this as True. However, if the storage format uses a database-specific type such as the MySQL time type, set it to False, and enter the database type name in the Database Type Name box.
Once you have defined a user-defined type to represent a custom storage format, you can use it just like any other data type, by selecting it from the Data Type drop down:
LightSpeed will now generate a boolean property, but map it to a string column. Furthermore, now that the user-defined type has been set up, you can apply it to as many properties as you want, which makes it very convenient for mapping patterns that are used widely across a database.
The YesNoConverter is applied to the comparand true, resulting in SQL such as:
SELECT ... FROM Horse WHERE IsFullOfGreeks = 'Y'
This enables users of the entity to write queries in domain terms, and to use LINQ without suffering compiler type errors.
Suppose now you wrote the LINQ query where o.SomeIntBoolField == true. The converter translates the comparand true to the database value 1. So LightSpeed would emit the SQL clause WHERE SomeIntBoolField == 1. This would not match values such as 2 or -1, even though these are meant to be considered true. You would therefore need to be careful to write the query where o.SomeIntBoolField != false, as this would translate to WHERE SomeIntBoolField <> 0. This is only an issue if your mapping is not one-to-one.
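To make the issue concrete, here is a sketch of the kind of converter under discussion, mapping a database int to a Boolean where any non-zero value counts as true (the class itself is illustrative, built on the FieldConverter base class shown earlier):

```csharp
public class IntBoolConverter : FieldConverter<int, bool>
{
    protected override bool ConvertFromDatabase(int databaseValue)
    {
        return databaseValue != 0;  // 1, 2 and -1 all convert to true...
    }

    protected override int ConvertToDatabase(bool value)
    {
        return value ? 1 : 0;  // ...but true always converts back to 1
    }
}
```

Because the mapping is many-to-one, the translation of == true (WHERE SomeIntBoolField = 1) misses rows stored as 2 or -1, which is why != false is the safer comparison.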
DB2
LINQ Support
The ToString() method is not supported on DB2.
The coalesce (??) operator cannot be used with strings on DB2.
Some grouping operations fail on DB2.
Tools Support
LightSpeed offers only limited support for migrations on DB2.
Firebird
General
Although Firebird capability still exists in LightSpeed 4, it is no longer actively maintained. New features are not tested with Firebird, and Firebird support may be removed in a future version of LightSpeed. In general, features beyond those described in the Basic Operations chapter, such as joins and groups, may not be implemented for Firebird.
Tools Support
The designer does not currently support dragging tables or views from Firebird. Migrations are not supported on Firebird.
MySQL
Designer Support
In order to drag tables and views onto the designer, you will need MySQL Connector/Net. This is a free download from the MySQL Web site. LightSpeed has not been tested with third-party Visual Studio/MySQL integration add-ins please contact us if you encounter problems when using such add-ins. Although MySQL Connector/Net is capable of displaying stored procedures, at the time of writing it does not support dragging stored procedures from Server Explorer. This is a limitation of MySQL Connector/Net, not of LightSpeed. You can still add stored procedures to your model using the Toolbox, or invoke them by constructing a ProcedureQuery object in code.
Oracle
Using Oracle Stored Procedures with LightSpeed
Oracle stored procedures return row set results through parameters. In order to locate the row set, LightSpeed adopts the convention that the results should be returned through an out parameter of type ref cursor named results.
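As a sketch, an Oracle procedure following this convention might be declared as below (the procedure name and table are illustrative; only the "results" ref cursor parameter is taken from the convention described above):

```sql
-- The row set is returned through an out ref cursor parameter named
-- "results", which is where LightSpeed looks for the query results.
CREATE OR REPLACE PROCEDURE SelectAllOrders(results OUT SYS_REFCURSOR)
AS
BEGIN
    OPEN results FOR SELECT * FROM Orders;
END;
```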
Designer Support
Some third-party add-ins which integrate Oracle into Visual Studio (such as the Devart DotConnect add-in) use a connection string format which is not understood by LightSpeed. If you encounter problems when using such an add-in, create a new Server Explorer connection using the .NET Framework Data Provider for Oracle or the Oracle Data Provider for .NET.
PostgreSQL
Designer Support
At the time of writing, the free PostgreSQL driver, Npgsql, does not provide Visual Studio integration to enable drag and drop onto the LightSpeed designer. If you want to import PostgreSQL tables into the designer, there are a number of options:
Mindscape has created a free add-in which allows you to display PostgreSQL databases in Server Explorer and drag them onto the designer. Please post in the support forum if you would like to use this add-in.
Some commercial PostgreSQL libraries include Visual Studio integration. These have not been tested with LightSpeed but some at least are known to work. Please post in the support forum if you need advice on using a specific commercial library with the LightSpeed designer.
You can use the lsgen.exe command line tool with the /l:lsmodel option to generate a .lsmodel file from your database.
SimpleDB
Connection String Format
When using Amazon SimpleDB, LightSpeed expects a connection string in the following format:
Access Key=your_access_key;Secret Access Key=your_secret_access_key
You can also specify the following additional connection string options:
Url: By default, LightSpeed will connect to Amazon's default SimpleDB endpoint (US East). Provide a Url element in the connection string to connect to another SimpleDB endpoint, such as US West, Europe or Singapore, or to a compatible store such as a local M/DB instance.
Consistent Read: By default, LightSpeed uses standard (eventually-consistent) reads. To perform consistent reads, add the option Consistent Read=true to the connection string.
Enable Batch Insert: By default, LightSpeed performs inserts one at a time. To enable batching of inserts, add the option Enable Batch Insert=true to the connection string.
Limitations
Amazon SimpleDB supports only a limited subset of the features of traditional relational databases. Consequently, the following limitations apply when using LightSpeed on SimpleDB:
No eager loading. Eager loads are silently translated into lazy loads and no warning is given.
No cross-table queries. A query of the form Entity.Attribute("Contributor.UserName") == "jd" will not work.
No batching of updates and deletes. Each operation is a separate request to SimpleDB. At present, LightSpeed makes no attempt to perform these operations in parallel, so a large number of updates may take a long time. Insert batching must be turned on explicitly in the connection string.
No queries involving server-side operations, e.g. SQL functions such as SUM or LOWER, or LINQ equivalents such as Sum() or ToLower().
To override the consistent-read setting on a per-query basis, use the WithProviderOptions method and pass a suitable AmazonSimpleDBProviderOptions object. (If you are using query objects, set Query.ProviderOptions.) Set ConsistentRead to true to force a consistent read, to false to force a standard read, and to null to use the default behaviour (standard or consistent according to the connection string). Overriding the consistent read setting for a single query
var forceConsistent = new AmazonSimpleDBProviderOptions { ConsistentRead = true };
var products = unitOfWork.Products
                         .WithProviderOptions(forceConsistent);
Data Storage
SimpleDB has no concept of data types: everything is treated as a string. The most important impact of this is on sorting: SimpleDB always performs lexical comparisons, so, for example, the string "123" sorts before "45". This is usually not desirable for numeric values. Therefore, when LightSpeed stores data in SimpleDB, it always stores it in a sortable format. Numeric values are padded with leading zeroes to the maximum length of the numeric type, e.g. 0000012345. DateTime values are stored using the ISO 8601 sortable format, e.g. 2001-01-01T01:02:03.004. For maximum efficiency you should therefore use the smallest suitable numeric type (e.g. store a year as a short rather than an int). We also strongly advise using floats or decimals instead of doubles, as a double takes 300+ characters to store! Because negative numbers cannot be range-compared using lexical ordering, you should not use signed types in comparisons unless you are confident that neither the value used in the query, nor any value in SimpleDB, is actually negative.
Tools Support
In order to drag domains onto the designer, you will need Mindscape SimpleDB Management Tools. You can use the free edition if you do not require SimpleDB management capability. Because SimpleDB treats all data as strings, the designer always infers the string type. Schema round-tripping is not currently available for SimpleDB. Migrations are not supported on SimpleDB.
SQLite
There are no known special considerations for SQLite.
SQL Server
SQL Server 2008 Spatial Data Types
To enable spatial data types at runtime, choose the SqlServer2008 DataProvider in your LightSpeedContext configuration. If you choose this provider, you must distribute Microsoft.SqlServer.Types.dll with your application, even if you do not use the spatial data types. You do not need to do anything to enable spatial data types at design time. The designer does not distinguish between SQL Server 2005 and SQL Server 2008: it will show SQL Server 2005 regardless of which version you are using, but you will still be able to use spatial data types.
LINQ Support
Some string operations (notably concatenations with ToString()) fail on SQL Server Compact. The coalesce (??) operator cannot be used with strings on SQL Server Compact. There are several engine limitations which can affect complex LINQ queries, specifically around joining, grouping, composition (e.g. Intersect), subselects and the Distinct operator.
Tools Support
Migrations have only limited support on SQL Server Compact.
VistaDB
Versions
Although VistaDB 3 capability still exists in LightSpeed 4, it is deprecated, and may be removed in a future release. We advise customers using VistaDB 3 to upgrade to VistaDB 4 if possible.
LINQ Support
There are several engine limitations which can affect complex LINQ queries, specifically around joining, grouping, composition (e.g. Intersect), subselects and the Distinct operator.
Tools Support
The designer does not currently support dragging views from VistaDB. Migrations have only limited support on VistaDB.
Database Migrations
Migrations are a solution to the problem of managing database versions. A migration captures the set of actions required to update a database from one version to the next (which might include schema changes such as widening or adding columns, and/or data changes such as computing defaults for new columns), and to roll that same database back to its original state. Capturing migrations as artifacts makes it easier and more reliable to apply the changes to multiple physical databases (development, test, staging and production), with high confidence that the same set of changes is being applied each time: because it worked in staging, it will work in production.

Being able to roll migrations backwards as well as forwards is important for acceptance testing and staging environments. For example, if staging fails, you will want to revert it to its previous status, but you may not want to rebuild the entire environment from scratch. Running migrations in reverse enables this.

LightSpeed's migrations framework provides you with:
- A way to specify migrations in code, using a database-agnostic API and the full flexibility of the .NET Framework.
- A tool that creates migrations for you as you make changes to your LightSpeed model.
- Command-line and GUI tools for running migrations, or for generating SQL scripts that can be run later by a DBA.
Creating Migrations
Writing Migration Code
A migration is implemented as a class which derives from Migration. The actions to take are specified in the class's overrides of the Migration.Up and Migration.Down methods. The Migration class provides a number of methods that you can call to specify the actions to take in the migration. This section lists a few of the key methods; see the Migration class documentation for specific overloads, more details and other methods.

Method                  Action                                                    Typical SQL Equivalent
AddTable                Adds a new table to the database                          CREATE TABLE
DropTable               Drops a table from the database                           DROP TABLE
RenameTable             Changes the name of an existing table                     (varies)
AddColumn               Adds a column to a table                                  ALTER TABLE ADD
AddForeignKeyColumn     Adds a column with a foreign key constraint to a table    ALTER TABLE ADD
DropColumn              Drops a column from a table                               ALTER TABLE DROP
ChangeColumn            Changes the data type or nullability of a column          ALTER TABLE ALTER
RenameColumn            Changes the name of a column                              (varies)
ExecuteNativeCommand    Executes arbitrary SQL                                    (varies)

You can use ExecuteNativeCommand to make changes that are not supported through the migrations API, but it may make your migration code non-portable across database providers.
(SQL equivalents are approximate; the exact SQL syntax varies from database to database.)
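As a sketch of the shape of a migration class (the table and column names here are invented, and the exact overload signatures are assumptions; consult the Migration class documentation):

```csharp
// Hypothetical sketch: "Order" and "Reference" are invented names, and the
// overloads shown are indicative only; check the Migration class
// documentation for the exact signatures.
public class AddOrderReference : Migration
{
    public override void Up()
    {
        // Forward change: add a string column to the Order table.
        AddColumn("Order", "Reference", ModelDataType.String);
    }

    public override void Down()
    {
        // Reverse change: restore the previous schema.
        DropColumn("Order", "Reference");
    }
}
```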
Choose a project and click OK, or click the New Project button. If you create a new project, LightSpeed prompts for the project file. Create a new folder and enter the desired project file name.
Don't use the same migrations project for multiple models.

To create a migration in the migration project, select the Migrations menu and choose Create Migration. LightSpeed displays the actions required in the Up method to migrate the database from the last migration to the current model design, and the actions required in the Down method to reverse the process. You can de-select any actions that you don't want generated for you. Enter a migration name and optionally a description. The migration name will be the name of the generated class and must be a valid class name in the migration project language. Click OK. LightSpeed generates and displays the migration class. You can now edit this class if required.
LightSpeed stores snapshots of your model in the migrations project, and uses these to work out what has changed. Do not move, delete or rename these snapshots!
Data Types
You will normally specify data types using the ModelDataType class. This abstracts your code from the database representation. For example, you would write ModelDataType.String rather than VARCHAR, NVARCHAR or VARCHAR2. When you require fine control over data types, you can instead specify a literal string. For example, if you want a string to be represented as a fixed-size field, you could specify CHAR instead of ModelDataType.String. Such data types are database specific.
Identity Generation
The Migration class also provides APIs to create the database resources used by the LightSpeed identity generation methods. See the AddKeyTable API.
Running Migrations
Running Migrations from Visual Studio
To run migrations from Visual Studio:
1. Open your model.
2. From the Migrations menu, choose Run Migrations. The Migrations window is displayed, showing the available migrations and the current database version. (If you see a "migrations project does not build" message, open the Errors window and fix the problems, then click Refresh to try again.)
3. Check that the Method is set to "Apply to connected database" and that the database and connection string are correct. If not, choose Change to change your settings. (See below for more information. You will always need to do this the first time you run a given migration project.)
4. Select the target version that you'd like to migrate the database to.
5. Select Apply Now.
LightSpeed will figure out whether to run the Up or Down migrations depending on the current database version.

Before you first run a migration from Visual Studio, you need to choose the target database. To do this:
1. Choose the Change button next to the database connection information. LightSpeed displays the Migration Settings wizard.
2. Choose Apply migrations directly to a database and click Next.
3. Either select a Server Explorer connection, or choose a database type and enter a connection string.
4. Click Finish.
If you subsequently change the migration target, to run the migrations against a different database, choose the Change button again and re-run the wizard.
to_version: The version to which to migrate the database. If omitted, the latest version will be selected.
LightSpeed will figure out whether to run the Up or Down migrations depending on the current database version.
If you want to monitor migration progress, for example to provide an interactive display or to log SQL for diagnostic purposes, use Migrator.CreateMigrator to create a Migrator instance, then call Migrator.Execute to run the migrations. You can set the MigrationLogger property to log migrations, and handle the MigrationExecuting and MigrationExecuted events to provide interactivity. Note that the migrations assembly has several dependencies which are not part of the normal LightSpeed redistributable.
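The monitoring workflow described above might be sketched as follows. The member names (CreateMigrator, Execute, MigrationLogger, MigrationExecuting, MigrationExecuted) come from the text, but the parameter lists and event-args shapes are assumptions; consult the Migrator API documentation for the exact signatures.

```csharp
// Hedged sketch of programmatic migration with progress monitoring.
// Parameters to CreateMigrator and the event-args members are assumptions.
var migrator = Migrator.CreateMigrator(connectionString,    // assumed parameters
                                       providerType,
                                       migrationsAssembly);

migrator.MigrationLogger = myLogger;                        // log migration SQL for diagnostics

migrator.MigrationExecuting += (sender, e) =>
    Console.WriteLine("Applying migration...");             // interactive progress display
migrator.MigrationExecuted += (sender, e) =>
    Console.WriteLine("Migration applied.");

migrator.Execute(targetVersion);                            // runs Up or Down as needed
```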
LightSpeed generates the SQL script and displays it in Visual Studio. You can save this SQL script to the folder of your choice.
If you subsequently need to change the migration settings, choose the Change button again and rerun the wizard.
Database Support
Migrations are not supported on Amazon SimpleDB. Some databases do not support all migration actions. For example, SQLite does not support adding a foreign key column to an existing table. In addition, some features are not currently implemented on DB2, SQL Server Compact Edition and VistaDB.
Appendices
Configuration Reference
The following configuration settings are available through the LightSpeedContext class and the configuration file.

- AuditInfoMode (auditInfo): How to generate user names for CreatedBy, UpdatedBy, etc. (More information: Implementing Storage Policies in LightSpeed)
- AutoTimestampMode (autoTimestamps): How to generate timestamps for CreatedOn, UpdatedOn, etc. (More information: Implementing Storage Policies in LightSpeed)
- Cache (cacheClass): Cache implementation for cached entities. (More information: Performance and Tuning)
- CascadeDeletes (cascadeDeletes): Whether delete operations cascade by default. (More information: Basic Operations)
- CommandTimeout (commandTimeout): How long LightSpeed should allow commands to run before failing them. (Config entry is in seconds.)
- ConnectionString (connectionStringName): The database connection string. (Config entry refers to the <connectionStrings> section.) (More information: Basic Operations)
- CustomAuditInfoStrategy: IAuditInfoStrategy for user names for CreatedBy, UpdatedBy, etc. (More information: Implementing Storage Policies in LightSpeed)
- CustomAutoTimestampStrategy: IAutoTimestampStrategy for timestamps for CreatedOn, UpdatedOn, etc. (More information: Implementing Storage Policies in LightSpeed)
- DataProvider (dataProvider): The type of database. (More information: Basic Operations)
- DetectPropertyNames (detectPropertyNames): Enables the Entity.Set overload which does not take a property name. True by default.
- DisplayNamingStrategy: IDisplayNamingStrategy for localising property names for display in validation messages.

IEntityFactory to be used when materialising entities. When using the KeyTable identity method, the number of Ids to reserve per database query. When using the Sequence or MultiSequence method, the increment amount of the sequence. How LightSpeed assigns Ids to entities. Additional configuration for the IdentityMethod
- ILogger for SQL and debug logging. (More information: Testing and Debugging)
- INamingStrategy for database mapping. (More information: Controlling the Database Mapping)
- Whether table names in the database use the plural or singular form of the entity class name (e.g. Person table or People table). (More information: Controlling the Database Mapping)
- Whether to quote identifiers (e.g. table names) in generated SQL. Avoids conflicts with SQL reserved words, but can cause case sensitivity issues on some databases. (More information: Basic Operations)
- The default database schema (can be overridden on a per-entity basis).
- The type of full-text search engine, if any. (More information: Advanced Querying Techniques)
- Location of full text search index file(s). (More information: Advanced Querying Techniques)
- Maximum number of update statements per command batch. (More information: Performance and Tuning)
- Runs LightSpeed in a mode that is compatible with ASP.NET medium trust, at the expense of some performance. (More information: Building Web Applications with LightSpeed)
- VerboseLogging: Shows additional detail in log displays. (More information: Testing and Debugging)
Partial Classes
Use partial classes to add behaviour to generated entity classes. You can also add properties in partial classes, but if those properties introduce additional state (as opposed to being wrapper or adapter properties around the existing LightSpeed fields), be sure to mark the backing fields with the TransientAttribute so LightSpeed doesn't try to persist them. Never use automatic properties in a LightSpeed entity class: the C# compiler generates a backing field which doesn't map to a database column, and you can't get at the field to mark it transient.
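The pattern described above might look like this. The entity and member names (Order, _displayCache, DisplayName) are invented for illustration; TransientAttribute is the LightSpeed attribute named in the text.

```csharp
// Hypothetical sketch: a partial class adding non-persistent state to a
// generated entity. Names are illustrative, not from a real model.
public partial class Order
{
    // Extra state introduced in the partial class: marked [Transient] so
    // LightSpeed does not try to persist it to a database column.
    [Transient]
    private string _displayCache;

    public string DisplayName
    {
        get
        {
            if (_displayCache == null)
                _displayCache = "Order #" + Id;   // wrapper over persisted state
            return _displayCache;
        }
    }
}
```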
the task at hand. You can also apply this to fields, for example eager-loading a large binary only if the "with high-resolution picture" aggregate is specified in the query.
ObjectDisposedException in SystemTransactionCompletedEvent
Some customers have reported experiencing ObjectDisposedException when disposing a system TransactionScope that encapsulated two nested units of work. This appears to be a bug in the .NET
Framework running on Windows XP. The solution is to use a single unit of work, or to dispose the first unit of work before creating the second unit of work.
lsgen also supports the following optional switches:
- /linq: Generate a strongly-typed LINQ unit of work class.
- /contracts: Generate WCF data contracts.
- /m:model_name: Specifies the model name for LINQ or WCF class names.
- /include:tables: Include only the specified tables in the generated model.
- /exclude:tables: Exclude the specified tables from the generated model.
- /stp:prefix: The prefix to be stripped from table names if present (e.g. tbl).
- /src:prefix: The prefix to be stripped from column names if present (e.g. col).
- /template:template_file: Use a custom code generation template.
The tables argument to the /include and /exclude switches is either a comma-separated list of table names, or the @ character followed by the name of a file containing the list of tables, one per line. E.g. /include:Cars,Engines or /exclude:@CarTables.txt. Associations are generated regardless of whether the association target type is generated, so care is required to avoid dangling associations. Individual columns can be excluded using table.column notation, e.g. MyTable.UnwantedColumn, or *.column to exclude columns of that name no matter which table they are in.

To create a designer model using lsgen, specify /l:lsmodel. In this case, the output destination (/o) must be a file, not a folder. lsgen does not generate any layout for the generated model: it is up to you to make the diagram look good by loading it into Visual Studio and dragging the entities into a suitable layout.

You can also generate C# or Visual Basic files from a LightSpeed designer model using lsgen. This is not normally necessary because Visual Studio automatically generates the appropriate code, but it
may be useful in certain automation scenarios. To do this, specify /p:lsmodel, and pass the model (.lsmodel) filename as the /c (connection string) option.
If you need to use one of these operators, you must materialise the query results using AsEnumerable or ToList. You can then process it using LINQ to Objects, which does support the operators mentioned above. For example:

Using LINQ to Objects operators with a LightSpeed query:
// Runtime error: can't translate TakeWhile to SQL
var millionaires = unitOfWork.Employees
                             .OrderByDescending(e => e.Salary)
                             .TakeWhile(e => e.Salary >= 1000000);

// Success: translates first two lines to SQL, then runs TakeWhile on the results
var millionaires = unitOfWork.Employees
                             .OrderByDescending(e => e.Salary)
                             .AsEnumerable()  // subsequent operators run client-side
                             .TakeWhile(e => e.Salary >= 1000000);
There is only partial support for the Concat, Intersect, Union and Except operators. These can be used to combine entity sets of the same type, but support for other scenarios (such as projections from multiple tables) is very limited. Many databases do not support the Intersect operator at all.
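The supported scenario (combining two entity sets of the same type) might look like this; the entity and property names (Employee, Salary, IsManager) are invented for illustration:

```csharp
// Supported: Union over two queries that both yield the same entity type.
var highEarners = unitOfWork.Employees.Where(e => e.Salary > 100000);
var managers    = unitOfWork.Employees.Where(e => e.IsManager);

var reviewGroup = highEarners.Union(managers).ToList();

// By contrast, combining projections from different tables (e.g. a Union of
// employee names and customer names) falls into the "very limited" category
// and may not be translatable at all, depending on the database provider.
```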
Further Reading
The following resources are not specific to LightSpeed, but provide background and more detail on the patterns and techniques that influenced the design of LightSpeed. Eric Evans, Domain Driven Design Martin Fowler, Patterns of Enterprise Application Architecture
Index
Add method, 49 adding entities, 49 ADO.NET, 224 ADO.NET transaction, 52 aliasing, 156 application, 70 creating and upgrading the database, 229 association and database first development, 129 creating in code, 25 creating in the designer, 20 custom resolver, 205 eager loading, 170 Get method, 26 mapping to foreign key, 56 metadata for, 165 one-way, 125 properties, 20 reverse, 25 Set method, 26 attributes and fields, 25 auditing, 187 auto through entity, 21 automatic properties, 24 shunned, 235 batch size, 179 batching, 179 performance, 236 BeginTransaction method, 52 block allocation of Ids, 64 bulk operations, 177 cache, 180 bulk operations and, 178 manual access, 182 cascade delete, 50 class table inheritance, 119 code first development, 24 code generation, 139 T4 templates, 142 templates, 139 using designer extensions, 140 column loading computed, 60 mapping, 57 mapping to association, 56 mapping to field, 55 mapping to Id, 55 value object mapping, 123 Column Name option, 57 ColumnAttribute, 57 command line tools, 238 composite key, 202 many-to-many association, 206 concrete table inheritance, 120 concurrency, 194 configuration, 37, 39, 71, 233 file, 39, 71 lightSpeedContexts section, 39 connection string, 40 SimpleDB, 218 ConnectionStrategy class, 76 convention over configuration, 54 Convert to Manual Implementation command, 148 creation time, 187 timestamp, 190 cross join, 157 cross-cutting keys, 205 CRUD, 36 stored procedures for CRUD operations, 198 custom association resolver, 205 custom attributes in the designer, 146 custom mapping data type, 132 custom property setter, 148 data format, 207 data provider, 40 data type conversion, 208 converting in property code, 207 custom mapping, 132 database-specific, 131 designer, 131 spatial, 221 database, 40, 212 converting data, 207 creating in the designer, 16 hints, 183 index, 183 legacy, 196 mapping, 54 migrations, 225 querying, 42 saving changes, 49, 50 synchronising to model, 128 view, 66 database connection 
and unit of work, 76 custom strategy, 76 database first development, 18, 128 and inheritance, 129 one-to-one association, 129 reorganising domain model, 129
value object, 129 database synchronisation configuring, 130 database-defined type, 131 DB2, 213 debugger visualizer, 116 debugging logging, 113 delete bulk delete, 177 deleting entities, 50 cascade, 50 designer, 12, 126 code generation, 139 creating migrations, 226 defaults for creating entities, 138 enum, 131 extensibility, 140 filtering the view, 134 keyboard shortcut, 147 Model Explorer window, 127 PostgreSQL, 217 referencing an external class, 148 tips, 146 toolbox, 12 user-defined type, 131 detecting concurrent edits, 194 discriminator, 118 distinct query, 154 documentation in the designer, 23 Documentation command, 23 domain model, 9 capturing an image of, 146 creating from database, 18, 30 creating in code, 24 creating with the designer, 12 linked files, 136 synchronising to database, 128 viewing large models, 134 eager load viewing load graph in designer, 135 eager loading, 170 entity, 37 adding new, 49 bulk update or delete, 177 classes, 16 composite key, 202 creating from database, 30 creating in code, 24 creating in the designer, 12 defaults for new entity types, 138 deleting, 50 field loading, 174 Get method, 26 IsValid property, 31 loading strategy, 170
loading through stored procedure, 69 mapping to table, 55 mapping to view, 67 metadata for, 165 natural key, 200 OnValidate method, 34 renaming, 133, 148 saving creation and update details, 187 Set method, 25, 26 soft delete, 188 updating, 50 Validate method, 31 Entity.Attribute method, 46 EntityCollection, 25 EntityHolder, 25 enum, 131 extension properties, 140 external class reference, 148 features, 7 field excluding from save, 60 lazy loading, 174 load only, 60 mapping to column, 55 metadata for, 165 Set method, 25 transient, 60 vs. property, 24, 55 field converter, 132, 208 queries and, 210 Find method, 46 Firebird, 214 flush changes to database, 49, 50 foreign key composite, 204 composite overlapping primary key, 205 mapping, 57 mapping to association, 56 naming convention, 56 part of primary key, 205 full text search bulk operations and, 178 function invoking SQL functions, 151 GeneratedId method, 200 Get method, 26 Get Started command, 147 getter custom code in property, 148 custom property getter, 207 grouping using query objects, 158 GUID identity generation, 63 performance tradeoffs, 63 GuidComb, 63 hint, 183 IAssociationResolver interface, 205
id identity type, 14 Id, 61 assigning, 61 composite key, 202 mapping to column, 55 natural key, 200 IdColumnName, 57 identity column, 64 batching and, 179 Identity Column Name option, 57 identity generation, 61 block allocation, 64 identity map, 180 identity method, 61 per entity, 62 setting, 61 IdentityBlockSize, 236 for key table, 62 for sequences, 63 sequences and, 236 IdentityMethod enumeration, 61 IDisplayNamingStrategy interface, 74 IFieldConverter interface, 132, 208 ILogger interface, 114 INamingStrategy, 58 index hint, 183 inheritance, 118 choosing a mapping, 120 class table, 119 concrete table, 120 discriminator, 118 external base type, 148 hiding inheritance arrows, 147 moving properties between classes, 129 single table, 118 inner join, 157 InnoDB, 215 inserting entities, 49 intersection, 160 IsValid property, 31 iterative development, 128 IUnitOfWork, 37 IUnitOfWork interface, 46 IUnitOfWork.Add method, 49 IUnitOfWork.Project method, 154 IUnitOfWork.Remove method, 50 IUnitOfWork.SaveChanges method, 49, 50 join using query objects, 156 key table naming convention, 56 keyboard shortcut, 147 KeyTable identity generation, 62 block allocation, 64 keyword escaping, 59
large models, 134 large objects, 174 lazy loading, 174 legacy database, 196 LightSpeed, 4 LightSpeed Model Explorer, 127 LightSpeedContext, 37, 39, 71, 235 scope, 72 lightSpeedContexts section, 71 linked model, 136 LINQ, 42 invoking custom functions, 151 supported features, 241 loading controlling with named aggregates, 171 default, 170 eager, 170 localisation, 35, 74 logging, 113 custom logger, 114 logical delete. See soft delete lsgen, 30, 238 lsmigrate, 239 many to many association, 21 creating in code, 28 many-to-many association composite key, 206 hiding through entity, 130 many-to-many associations designer, 144 mapping, 54 column, 57 default, 55 non-default, 57 table, 57 value object, 123 metadata, 163 and field values, 167 Microsoft SQL Server. See SQL Server Migration class, 226 migrations, 225 creating SQL scripts, 230 running, 228 Model Explorer window, 127 model first development, 16, 128 MyISAM, 215 MySQL, 215 round-tripping policy, 130 N+1 problem, 170 named aggregate, 171 filtering designer view by, 134 named aggregates, 176 lazy loading fields using, 174 visualising, 176 naming convention column, 55 defining your own, 58
foreign key, 56 key table, 56 Oracle stored procedure results, 216 primary key, 55 table, 55 natural key, 200 NoReverseAssociationAttribute, 125 ObjectDisposedException, 236 ODP.NET could not load file or assembly, 216 one to many association, 20 one to one association, 21 one-to-one association and database first development, 129 OnValidate method, 34 optimistic concurrency, 194 OptimisticConcurrencyException responding to, 194 Oracle, 216 outer join, 157 paging using LINQ, 44 using query objects, 47 partial class creating, 133 performance, 169 measuring, 185 persistence, 60 load-only fields, 60 transient fields, 60 pluralization, 55 polymorphism, 120 PostgreSQL, 217 primary key composite key, 202 mapping, 57 naming convention, 55 natural key, 200 profiling, 115 Project method, 154 projections using query objects, 154 properties and metadata, 166 property creating in the designer, 14 custom getter and setter, 148, 207 localising property name, 74 renaming, 133, 148 Set method, 25 vs. field, 24, 55 query expression, 46 query object, 46, 154 subexpressions, 161 query objects vs. LINQ, 48 QueryExpression class, 46
querying, 42, 149 query expression, 46 using query objects, 46 quoting identifiers, 59 rapid application development, 128 refactoring, 133 reference data, 125 release notes, 6 reminder note, 147 Removemethod, 50 removing entities, 50 Rename command, 148 renaming entities and fields, 133 reserved word, 59 reverse association, 25 SaveChanges method, 49, 50 saving changes, 49, 50 transactional, 52 schema, 73 migrating database schema, 225 second level cache, 180 sequence identity generation, 62 block allocation, 64 options, 64 sequence per table, 63 Set method, 25, 26 setter custom code in property, 148 custom property setter, 207 SimpleDB, 218 single table inheritance, 118 soft delete, 188 sorting using LINQ, 44 using query objects, 47 spatial data SQL Server 2008, 221 SQL creating migration scripts, 230 invoking SQL functions, 151 logging, 113 timing, 115 SQL keyword escaping, 59 SQL Server, 221 SQL Server 2000, 221 SQL Server Compact, 222 SQLite, 220 stored procedure, 68 CRUD procedures, 198 Oracle, 216 subexpression using query objects, 161 surrogate key, 200 T4 templates, 142 table dragging into model, 18
mapping, 57 mapping to entity, 55 table hint, 183 Table Name option, 57 table per concrete class, 120 table per hierarchy, 118 table per subclass, 119 TableAttribute, 57 tag filtering designer view by, 134 through association, 21 auto through entity, 21 creating in code, 28 hiding through entity, 130 timestamp, 190 tips and tricks, 235 tracing, 113 track create time, 187 track update time, 187 transaction, 52 ADO.NET, 52 TransactionScope, 52 union, 160 unit of work, 37 adding entities, 49 bulk operations and, 178 class, 42 database connection, 76 deleting entities, 50 saving changes, 49, 50 short running vs. long running, 235 update bulk update, 177 update batching, 179 Update Database command, 16, 128 Update From Source command, 19, 128 update time, 187 timestamp, 190 UpdateBatchSize, 179, 236 updating entities, 50
user identifying, 192 track creating or updating, 187 track deleting user, 188 user-defined type designer, 131 Validate method, 31 validation, 31 advanced designer options, 32 customising automatic, 33 declarative in code, 34 designer, 31 localising messages, 35, 74 procedural, 34 whole object, 34 validation attributes and fields, 25 value object, 121 and database first development, 121, 129 editing values, 123 mapping, 123 view database, 66, 156 filtering designer view, 134 mapping to entity, 67 querying through, 66 saving designer filter, 135 VistaDB, 223 Visual Basic, 29 visual designer. See designer Visual Studio debugger visualizer, 116 designer, 126 mapping keyboard to designer, 147 migrations project, 226 running migrations from, 228 WithAggregate operator, 172, 176 XML documentation in the designer, 23