Dr.Bob’s Delphi XE2 DataSnap Development Essentials
Second XE2 Edition, April 2012 for customers of Bob Swart
12. Case Study: Developer Issue Report Tool (DIRT) 2.0
    Database and Data Model
        User
        Report
        Comment
        Watch
    DataSnap Server
    Login and Authentication
Delphi XE increased the capabilities and support of DataSnap even further, fixing a number
of bugs and issues in Delphi 2010’s implementation of DataSnap (like filter performance
and stateless DataSnap servers). New additions in Delphi XE’s implementation of
DataSnap are support for “heavy” callbacks, the use of almost any TObject as Server
Method parameter type, and three new DataSnap Server Wizards (plus two wizards for a
DataSnap client module).
On Wednesday, February 29th, 2012, Update #4 for Delphi XE2 and C++Builder XE2 was
released. See also http://cc.embarcadero.com/item/28758 for the new ISO. The update
required an uninstall and re-install of RAD Studio XE2, as mentioned in the release notes
at http://docwiki.embarcadero.com/RADStudio/XE2/en/Release_Notes_for_XE2_Update_4
With Update 4 and Help Update 3 installed, the Delphi XE2 About Box should report
version 16.0.4429.46931 for both Delphi XE2 and C++Builder XE2.
Delphi XE2 Update #4 extended the four mobile connectors with two additional proxy
solutions for FreePascal, targeting iOS 4.2 as well as iOS 5.0. So now we can use Delphi to
produce a DataSnap iOS Client and compile it with FreePascal in Xcode to a native iOS
client. This is what will be demonstrated in the Mobile Connectors section (new in the
second edition of this manual).
Apart from the Mobile Connectors, the biggest new feature added to Delphi XE2 and hence
DataSnap XE2 is the option to compile not just 32-bit Windows but also 64-bit Windows
DataSnap Servers, as well as 32-bit, 64-bit and Mac OS X DataSnap clients, plus iOS apps
for iPad, iPhone and iPod Touch (using Xcode and FPC).
The Mac OS X clients require FireMonkey (the cross-platform application framework for
Delphi XE2), and will be demonstrated in detail here. Compared to VCL Clients, the
FireMonkey Clients do not use any data-aware controls (a concept that doesn’t exist for
FireMonkey applications), so we have to use the so-called LiveBindings in these DataSnap
client applications.
As you will see from the upcoming screenshots, I’m using Windows 7 Professional as
operating system for the examples, plus Windows Server 2008 Web Edition for the
deployment of the DataSnap ISAPI servers.
EMPLOYEE.GDB
The example database EMPLOYEE.GDB contains a number of tables and fields. In order to
get a good overview of what the database tables look like that we’ll be using, here is a
brief overview. Note that the data is outdated in places, with dates from the early '90s
(even before Delphi 1 was released) and pre-euro currencies for European countries that
now use the euro.
COUNTRY
The COUNTRY table defines the currency which is used for a country, so any table with a
COUNTRY field can get the corresponding CURRENCY using the COUNTRY table.
Field Name Type Req
0 COUNTRY TWideStringField Yes
1 CURRENCY TWideStringField
There are 14 records in the COUNTRY table. Unfortunately, the data is very outdated, as
Italy, France, Germany, the Netherlands, Belgium, and Austria are not yet associated with
the EURO currency, but rather with their pre-euro currencies.
CUSTOMER
Field Name Type Req
0 CUST_NO TIntegerField Yes
1 CUSTOMER TWideStringField
2 CONTACT_FIRST TWideStringField
3 CONTACT_LAST TWideStringField
4 PHONE_NO TWideStringField
5 ADDRESS_LINE1 TWideStringField
6 ADDRESS_LINE2 TWideStringField
7 CITY TWideStringField
8 STATE_PROVINCE TWideStringField
9 COUNTRY TWideStringField
10 POSTAL_CODE TWideStringField
11 ON_HOLD TWideStringField
There are 15 records in the CUSTOMER table, and the COUNTRY field can be used to link
the CUSTOMER table to the COUNTRY table (to find out the CURRENCY).
SALES
The 33 records from the SALES table are connected to the CUSTOMER table using the
CUST_NO field. The SALES_REP field is a link to the EMP_NO field of the EMPLOYEE table.
Field Name Type Req
0 PO_NUMBER TWideStringField Yes
1 CUST_NO TIntegerField
2 SALES_REP TSmallintField
3 ORDER_STATUS TWideStringField
4 ORDER_DATE TSQLTimeStampField
5 SHIP_DATE TSQLTimeStampField
6 DATE_NEEDED TSQLTimeStampField
7 PAID TWideStringField
8 QTY_ORDERED TIntegerField
9 TOTAL_VALUE TFMTBCDField
10 DISCOUNT TFloatField
11 ITEM_TYPE TWideStringField
12 AGED TFloatField
DEPARTMENT
There are 21 records in the DEPARTMENT table. This table is linked to the EMPLOYEE table
using the MNGR_NO field (associated with the EMP_NO field in the EMPLOYEE table), but
also with itself, using the HEAD_DEPT field. Apart from the first record, all records have a
“head department” specified in the HEAD_DEPT field, up to 4 levels deep.
EMPLOYEE
The EMPLOYEE table contains 42 records and links to the DEPARTMENT table (via the
DEPT_NO field) and also to the JOB table using the combined primary key fields
JOB_CODE;JOB_GRADE;JOB_COUNTRY.
Using the JOB_COUNTRY field we could also potentially link to the COUNTRY table,
determining the CURRENCY for the SALARY for example (assuming the SALARY is paid in
the currency for the JOB COUNTRY, that is).
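That link can be sketched as a simple join; the component name SQLQuery1 is an assumption for this example (FULL_NAME is a computed field in the example EMPLOYEE table):

```delphi
// Sketch: look up the currency in which each SALARY would be paid,
// by joining EMPLOYEE to COUNTRY on the JOB_COUNTRY field.
SQLQuery1.SQL.Text :=
  'SELECT E.FULL_NAME, E.SALARY, C.CURRENCY ' +
  'FROM EMPLOYEE E ' +
  'JOIN COUNTRY C ON E.JOB_COUNTRY = C.COUNTRY';
SQLQuery1.Open;
```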
EMPLOYEE_PROJECT
The EMPLOYEE_PROJECT table contains 28 records and combines the EMPLOYEE and
(upcoming) PROJECT tables using the EMP_NO and PROJ_ID fields.
It is a true N:M connection, since some employees work on more than one project (and
several of the 6 projects have more than one employee working on them).
JOB
The JOB table contains 31 records and can be linked to the EMPLOYEE table using the
combined JOB_CODE;JOB_GRADE;JOB_COUNTRY fields. The JOB table can also be linked
to the COUNTRY table using the JOB_COUNTRY field from the JOB table against the
COUNTRY field of the COUNTRY table.
PROJ_DEPT_BUDGET
The PROJ_DEPT_BUDGET table contains 24 records (for the fiscal years 1994, 1995 and
1996) and is linked to the PROJECT table using the PROJ_ID field, and also to the
DEPARTMENT table using the DEPT_NO field.
PROJECT
The PROJECT table contains 6 records and is used by the EMPLOYEE_PROJECT table,
linking it to EMPLOYEE table.
The TEAM_LEADER field can be used as connection to the EMPLOYEE table (associating the
TEAM_LEADER value with the EMP_NO value). One project does not have a value for the
TEAM_LEADER field, however.
SALARY_HISTORY
The SALARY_HISTORY table is connected to the EMPLOYEE table using the EMP_NO field.
Based on these tables, we can derive the following raw relationships, which can be used as
basis for some of the more complex queries and views later in this courseware manual.
PROJ_DEPT_BUDGET
    PROJ_ID --> PROJECT (PROJ_ID)
        PROJ_ID + EMP_NO --> EMPLOYEE_PROJECT
            EMP_NO --> EMPLOYEE (EMP_NO)
                JOB_COUNTRY --> COUNTRY (COUNTRY)
                JOB_CODE, JOB_GRADE, JOB_COUNTRY --> JOB
                    JOB_COUNTRY --> COUNTRY (COUNTRY)
                EMP_NO --> SALARY_HISTORY
                DEPT_NO --> DEPARTMENT (DEPT_NO)
                    MNGR_NO --> EMPLOYEE (EMP_NO)
CUSTOMER
    COUNTRY --> COUNTRY (COUNTRY)
    CUST_NO --> SALES
        SALES_REP --> EMPLOYEE (EMP_NO)
And although the data is outdated at times, at least everyone with a copy of Delphi will
have a copy of InterBase and this example EMPLOYEE database.
Delphi XE2 supports three DataSnap wizards in the Object Repository to help you build a
DataSnap Server application. In logical order, these three choices for DataSnap Server
applications are:
- DataSnap Server
- DataSnap WebBroker Application
- DataSnap REST Application
You can see these different choices when you start Delphi XE2 (Enterprise or higher), do
File | New – Other and select the DataSnap Server category of the Object Repository:
Each of these wizards will give you three different choices regarding the actual project
target (as well as other choices, so the number of different types of DataSnap Servers is in
fact quite large).
Since it may not be obvious what the capabilities and limitations are of these different
project types, I will give an overview of the options followed by a summary to help you
determine which DataSnap Server is best suited for your particular needs.
DataSnap Server
The DataSnap Server wizard creates a VCL application that exposes DataSnap services.
This wizard can be compared to the (simpler) one included with Delphi XE and 2010
Enterprise, and produces the type of applications that we typically created by hand in
Delphi 2009 Enterprise.
The VCL Forms application produces a visual form, while the console application runs from
the console, both using a ServerContainer unit for the DataSnap server details and a
Server Methods unit for the implementation details.
The latter two units are also used by the Service application type, of course. However, the
service application is a bit special compared to the other two application types, since it
produces an application that will be deployed as a Windows Service, which does not
require someone to log in to the deployment machine in order for the service to start. The
VCL Forms application and console application both require a Windows login session, and
someone (or something) to start the application (it can be placed in the Startup group,
but that still requires someone to log in in order to trigger the Startup group).
The service application is a bit harder to test and debug compared to the VCL Forms
application and console application that can just be run and debugged from the IDE.
All three application types can use TCP/IP as well as HTTP or HTTPS as communication
protocol, using DataSnap transport components that use Indy classes for the actual
communication. Note that Delphi XE2 introduces a way to "enable" HTTPS using these
DataSnap transport components, so keep that in mind later when you want to add
authentication or authorization functionality (traffic over the TCP/IP and HTTP channels is
sent unencrypted unless you add encryption filters, so eavesdropping or man-in-the-middle
attacks are possible).
In order to use the HTTPS capabilities, a certificate is required, and details of the
certificate can be specified in the DataSnap Server wizard (page 4 of 5).
The second page of the DataSnap Server wizard can be used to select the server features.
This includes a choice for the protocols (TCP/IP, HTTP and/or HTTPS), Authentication and
Authorization (if Authentication is also checked), as well as a Server Methods Class and
Sample Methods (if the Server Methods Class is checked).
New in Delphi XE2 is the option for the wizard to automatically add filters for encryption
(this will actually add two filters: RSA and PC1) and compression (ZLib). Also new in
Delphi XE2 is the option to add JavaScript files (so we can test the DataSnap Server
without having to make any other client), and the Mobile Connectors, which will be
covered in their own section later in this courseware manual.
By default, the TCP/IP protocol is checked as well as the Server Method Class and Sample
Methods options.
Note that the DataSnap Server wizard is the only one (of the three DataSnap wizards) that
produces a DataSnap Server that has the ability to communicate using TCP/IP. The
DataSnap WebBroker application wizard and DataSnap REST application wizard only
produce servers that communicate using HTTP (with the option of HTTPS for ISAPI DLLs).
Since TCP/IP communication is a lot faster than communication over HTTP(S), this is
something to keep in mind when building DataSnap Servers. It's easy to extend even an
existing DataSnap Server application with TCP/IP capabilities if the application is a VCL
Forms, Console or Service application.
Finally, the Server Methods Class is included by default as well (on the last page, the
wizard offers a choice of ancestor class between TComponent, TDataModule and
TDSServerModule). The Sample Methods – only available if you include a Server Methods
Class, of course – are the EchoString and ReverseString methods.
The third page of the New DataSnap Server wizard can be used to specify the port
numbers for the selected communication protocols (TCP/IP, HTTP and/or HTTPS). Note
that if you did not select a single protocol, then TCP/IP will be selected automatically for
you. By default, the TCP/IP port is set to 211, the HTTP port number is set to 8080, and
the HTTPS port number is set to 8081. You can click on buttons to Test the port (to see if
no other process is currently bound to that port), or to find an available open port. Note
that only one process can "listen" on a certain port number at the same time, and if you
have multiple DataSnap servers, then each of them requires a unique TCP/IP, HTTP and
HTTPS port to listen to, so choose your port numbers wisely!
If you specify a port number which is already “taken” and in use by another application,
you get a clear error message:
When using HTTPS, you will also see a special page in the New DataSnap Server wizard
that asks for the details of the X.509 certificate that will be used for your HTTPS
connection:
Another way to use HTTPS is to create an ISAPI DLL for either a DataSnap WebBroker
Server or a DataSnap REST Server (in which case the certificate needs to be installed in
IIS to allow a https:// connection to the ISAPI DLL on the web server).
The fourth (or fifth) and possibly last page is used to define the base class for the server
methods ancestor class (note that you ONLY get this page if you selected to have a Server
Methods Class). The ancestor type can be TComponent, TDataModule or TDSServerModule.
TComponent is good enough if you only want to expose Server Methods. TDataModule is a
container for non-visual components, and can be used for exposing Server Methods as
well, while TDSServerModule has RTTI enabled for methods from the start, and also
implements the IAppServerFastCall interface methods, so this is the ideal choice if you
want to expose both DataSets (using TDataSetProviders) and server methods.
Note that while a TDSServerModule is a more ready-to-use choice for exposing DataSets
as well as server methods, you can also use a TDataModule to expose server methods by
explicitly wrapping the class inside {$METHODINFO ON} … {$METHODINFO OFF}
directives, which will happen if you choose sample server methods in combination with the
TDataModule:
type
{$METHODINFO ON}
  TServerMethods1 = class(TDataModule)
  private
    { Private declarations }
  public
    { Public declarations }
    function EchoString(Value: string): string;
    function ReverseString(Value: string): string;
  end;
{$METHODINFO OFF}
If you selected the option “JavaScript Files” and/or the option “Mobile Connectors”, then
there will be one more page of the New DataSnap Server wizard, asking you for the
location to store the project. This is because both the JavaScript Files and the Mobile
Connectors options require files to be copied over to a directory structure.
Make sure to specify not just a valid directory name here, but a valid identifier (so a name
with no spaces, and also not starting with a digit for example). The name will not only be
used for the directory, but also for the project name itself.
If you’ve specified an invalid name, like “Project New” you will get an error message:
And have to start the New DataSnap Server wizard all over again (that will probably
happen to you only once – after that, you will remember to specify a valid identifier name
here).
The Server Methods unit is the same for all project targets, only depending on your choice
of the base class and the presence of sample server methods. For the TDSServerModule
base class, the definition is as follows:
unit ServerMethodsUnit1;

interface

uses
  System.SysUtils, System.Classes, DataSnap.DSServer, DataSnap.DSAuth;

type
  TServerMethods1 = class(TDSServerModule)
  private
    { Private declarations }
  public
    { Public declarations }
    function EchoString(Value: string): string;
    function ReverseString(Value: string): string;
  end;
Note the new scoped unit names with the System. and DataSnap. scope prefix in their
name. This also means that XE2 projects cannot be recompiled using Delphi XE or lower.
The implementation of the TServerMethods1 class for the two sample methods is as
follows:
implementation

uses
  System.StrUtils;

{$R *.dfm}

function TServerMethods1.EchoString(Value: string): string;
begin
  Result := Value;
end;

function TServerMethods1.ReverseString(Value: string): string;
begin
  Result := System.StrUtils.ReverseString(Value);
end;

end.
Note that Delphi generates a unit with the name ServerMethodsUnitX and a class with the
name TServerMethodsX where X is a number, but I always rename both the class and the
unit to a more sensible name based on the purpose of the server module. In this case,
there is only a "demo" purpose, so I often call the class MyServerMethods (so the type
becomes TMyServerMethods) and save the unit as ServerMethods.pas.
The designer surface of the Server Methods unit is empty, but can be used to place any
non-visual component, including database connection components, data access
components and TDataSetProvider components to expose datasets from the Server
Methods unit.
Note that the ClassGroup property of the Server Methods class is empty by default. It can
be set to System.Classes.TPersistent (for any component), Vcl.Controls.TControl (for VCL
components), FMX.Types.TControl (for FireMonkey for Windows or OS X components), or
FMX_Types.TControl (for FireMonkey for iOS components). For the DataSnap Server, we
can either leave it empty or assign it to System.Classes.TPersistent or
Vcl.Controls.TControl (we can only create VCL DataSnap servers at this time).
It’s possible to share the same ServerMethods units with all DataSnap projects. In that
case, you should save it in a parent directory of the DataSnap project itself, so other
DataSnap projects can share the same ServerMethodsUnit.pas file.
The design area of the Server Container unit contains more components – at least 3
(TDSServer, TDSServerClass and one of the transport components), and at most 11 -
depending on your choices for the communication protocols and the presence of the
Authentication feature, the JavaScript files and the Mobile Connectors.
The TDSServer component is the “engine” of the DataSnap server, and always required.
The TDSServerClass is the component responsible for creating the actual instance of the
Server Module, and determines the Server Class using the OnGetClass event.
procedure TServerContainer1.DSServerClass1GetClass(
  DSServerClass: TDSServerClass; var PersistentClass: TPersistentClass);
begin
  PersistentClass := ServerMethodsUnit1.TServerMethods1;
end;
Note that you may have to manually change this code, as the generated unit name will be
ServerMethodsUnitX and the class TServerMethodsX, where X is a number starting at 1.
When you change the name of your Server Methods class and save the unit under a
different name, the above code is not adjusted automatically, of course.
The TDSServerClass also controls the LifeCycle of the Server Module (using the LifeCycle
property, which can be set to Server, Session, or Invocation – by default set to Session).
The TDSServerClass must be connected to the TDSServer component using the Server
property.
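The same wiring can be expressed in code; a sketch, assuming the default component names from the wizard (normally you set these properties in the Object Inspector):

```delphi
// Connect the server class to the DataSnap "engine" and pick a life cycle.
DSServerClass1.Server := DSServer1;
// 'Server' = one shared instance, 'Session' = one per session (the default),
// 'Invocation' = a new instance for every call (stateless).
DSServerClass1.LifeCycle := 'Session';
```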
If you checked the Authentication and/or Authorization option, then you received a
TDSAuthenticationManager component. This component contains two event handlers:
OnUserAuthenticate and OnUserAuthorize. By default both event handlers set the valid
property to True, but the included comments (not available in all DataSnap project
targets) explain what needs to be done here:
procedure TServerContainer1.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  { TODO : Validate the client user and password.
    If role-based authorization is needed, add role names to the UserRoles parameter }
  valid := True;
end;

procedure TServerContainer1.DSAuthenticationManager1UserAuthorize(
  Sender: TObject; EventObject: TDSAuthorizeEventObject;
  var valid: Boolean);
begin
  { TODO : Authorize a user to execute a method.
    Use values from EventObject such as UserName, UserRoles, AuthorizedRoles
    and DeniedRoles.
    Use DSAuthenticationManager1.Roles to define Authorized and Denied roles
    for particular server methods. }
  valid := True;
end;
The TDSProxyGenerator component generates a proxy class for REST clients, based on
the meta data from the TDSServerMetaDataProvider. The Writer property of the proxy
generator shows that it can now produce not just Delphi and C++Builder DBX and REST
proxy files, but also – for use with the mobile connectors – C# Silverlight, Java (Android),
Java (BlackBerry) and Objective-C iOS 4.2 REST proxies, plus FreePascal proxies for iOS
4.2 or 5.0.
The Server Container unit is not the same for all three project targets: the DataSnap
Service application uses a different base class and adds a number of extra methods to
allow the application to act as a Windows Service. Apart from that, the DataSnap stand-
alone console application includes a procedure RunDSServer to start the server and stop it
when the user presses Escape. Also, the ServerContainer class is either derived from a
TService or a TDataModule, storing different properties in the DFM file. As a consequence,
it’s not easy to share the same Server Container unit in a DataSnap stand-alone VCL
application, a DataSnap stand-alone Console application and a DataSnap Service
application.
There’s one more remark about the DataSnap Service application that needs to be made:
it’s mentioned in the source code, but may go unnoticed:
program DataSnapServerService;

uses
  Vcl.SvcMgr,
  ServerMethods in '..\ServerMethods.pas' {MyServerMethods: TDSServerModule},
  ServerContainer in '..\ServerContainer.pas' {MyServerContainer: TDataModule};

{$R *.RES}

begin
  // Windows 2003 Server requires StartServiceCtrlDispatcher to be
  // called before CoRegisterClassObject, which can be called indirectly
  // by Application.Initialize. TServiceApplication.DelayInitialize allows
  // Application.Initialize to be called from TService.Main (after
  // StartServiceCtrlDispatcher has been called).
  //
  // Delayed initialization of the Application object may affect
  // events which then occur prior to initialization, such as
  // TService.OnCreate. It is only recommended if the ServiceApplication
  // registers a class object with OLE and is intended for use with
  // Windows 2003 Server.
  //
  // Application.DelayInitialize := True;
  //
  if not Application.DelayInitialize or Application.Installing then
    Application.Initialize;
  Application.Run;
end.
This is also the case for Windows Server 2008 or in fact any version of Windows after 2003
(including Windows Vista and Windows 7, if you should consider these for deployment).
Personally, I use the following line of code to check for the Windows version (note that
this requires the System.SysUtils unit in the uses clause):

  Application.DelayInitialize := CheckWin32Version(5, 2); // Windows Server 2003 or higher

That will ensure that the DataSnap Service will also be installed with
Application.DelayInitialize set to True for these even newer Windows versions.
The DataSnap Server wizard can generate three different projects: a stand-alone VCL
Forms application (great for testing and debugging), a stand-alone Console application,
and a Windows Service application (great for deployment). As transport protocol, we can
use fast TCP/IP, HTTP or – if you have an X.509 certificate – HTTPS.
DataSnap WebBroker Application
This wizard starts by giving us a choice, similar to the other DataSnap application wizards,
between the possible project targets:
- Stand-alone VCL application
- Stand-alone console application
- ISAPI dynamic link library
The first two application types use the new TIdHTTPWebBrokerBridge class, either on the
main form of the VCL application, or created dynamically in the console application. The
TIdHTTPWebBrokerBridge class from the IdHTTPWebBrokerBridge.pas unit has one
important property: the DefaultPort which should be set to the HTTP port that the Indy
HTTP Server will bind itself to. The HTTP Server can be started by setting the Active
property to True (and stopped by setting Active back to False).
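In the console application, for example, the server start-up boils down to something along these lines (a simplified sketch, not the literal generated code):

```delphi
var
  Server: TIdHTTPWebBrokerBridge;
begin
  Server := TIdHTTPWebBrokerBridge.Create(nil);
  try
    Server.DefaultPort := 8080; // port the Indy HTTP server binds to
    Server.Active := True;      // start listening for HTTP requests
    Writeln('DataSnap server listening on port ', Server.DefaultPort);
    Readln;                     // keep the console server running
    Server.Active := False;     // stop the server again
  finally
    Server.Free;
  end;
end;
```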
The first two application types produce executables that can be run on a server machine,
without need for a web server, although the firewall must explicitly allow incoming
requests at the specified DefaultPort of the TIdHTTPWebBrokerBridge class.
The version of Indy that ships with Delphi XE did not support HTTPS right away (there was
no property or line of code you could change to make the TIdHTTPWebBrokerBridge use
HTTPS instead of HTTP). However, Delphi XE2 adds support for HTTPS using X.509
certificates (the details of which you can enter in the wizard itself, or later at design-time).
The ISAPI dynamic link library produces an ISAPI DLL that can be deployed on Internet
Information Services (IIS). As a consequence, it’s easy to support HTTPS as well as HTTP,
and the port that is used for incoming connections is determined by IIS itself (and firewalls
should be configured accordingly, but that is not a specific concern of the DataSnap
application itself).
The second page of the wizard can be used to specify the port number for the HTTP
communication. By default, this is set to 8080. You can click on buttons to Test the port,
or to find an open port. Note that only one process can "listen" on a certain port number
at the same time, so choose your port numbers wisely.
If you click on the Find Open Port button, then Delphi will try to find and enter a free port
number in the editbox. Note that it will increase this HTTP port number every time you
click on the Find Open Port button (to be honest, I have no idea where this value is stored,
but on one of my machines I’m up to port 49268 right now).
This port number is used by the TIdHTTPWebBrokerBridge class, by the way, but you can
always change it later. In the stand-alone VCL application the port number is put inside
the EditPort TEdit’s Text property on the main form, and in the stand-alone console
application the port value is hardcoded as parameter to the RunServer procedure at the
bottom of the project .dpr file.
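For the console target, changing the port later is therefore a matter of editing the bottom of the .dpr file; a sketch of what that call looks like (the surrounding generated code is omitted):

```delphi
begin
  // The hard-coded port is passed to the generated RunServer procedure;
  // edit the value here to move the server to another port.
  RunServer(8080);
end.
```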
If you checked the "HTTPS" option, then a third page will show up where you can specify
the details for the X.509 certificate (see the screenshot a few pages back), specifically the
Certificate file name, Key file name, Key file password and Root certificate file name are
required, and you can click on the Test button to validate these settings.
Obviously, you will not see these pages if you selected an ISAPI dynamic link library, since
that choice will produce an ISAPI DLL that you can deploy on IIS (which will determine the
ports to use for HTTP and HTTPS).
The next page of the New DataSnap WebBroker Application wizard can be used to select
the server features. This includes Authentication and Authorization (if Authentication is
also checked), as well as a Server Method Class and Sample Methods (if the Server
Methods Class is checked). New in Delphi XE2 are the options to include the Encryption
(RSA and PC1) and Compression (ZLib) filters, to generate the Mobile Connector files, and
finally a separate Server Module.
Using a separate Server Module means that the TDSServer component will be placed on
its own data module instead of on the Web Module. Using a separate server module
enables support for heavyweight callbacks (which will be covered later in more detail).
By default, only the Server Methods class and Sample Methods options will be checked
(and you can even uncheck the Sample Methods to avoid these simple methods EchoString
and ReverseString).
The next page of the Wizard is used to define the base class for the server methods
ancestor class (note that you ONLY get this page if you selected to have a Server Method
Class – if you uncheck the Server Methods Class option, then a click on the Next button
will just enable the Finish button).
If you include a Server Methods Class, then the ancestor type can be TComponent,
TDataModule or TDSServerModule. TComponent is good enough if you only want to expose
Server Methods. TDataModule is a container for non-visual components, and can be used
for exposing Server Methods as well, while TDSServerModule has RTTI enabled for
methods from the start, and also implements the IAppServerFastCall interface methods, so
this is the ideal choice if you want to expose both TDataSets (using TDataSetProviders)
and server methods.
A TComponent does not offer a design area to place non-visual components, which you do
get when you use a TDataModule or a TDSServerModule. In general, if you do not know
beforehand whether you want to expose TDataSets using TDataSetProviders, then you can
best pick the TDSServerModule, so at least this functionality is potentially available when
you need it.
If you selected either the “Mobile Connectors” and/or the “Server Module” option (see
previous page), then the final page of the New DataSnap WebBroker Application wizard
will ask you where to save the project files. Make sure to specify a valid identifier again,
since the name will not only be used for the directory, but also as project name inside that
directory.
In that case, the DataSnap components are divided between the WebModuleUnit and the
extra ServerContainerUnit.
Without that extra ServerContainerUnit, the Web Module unit contains the following
components in the designer (depending on the selected features):
Note how both the two encryption filters and the compression filter are also part of the
TDSHTTPWebDispatcher (just like they were added to the HTTP and/or TCP transport
components in the regular DataSnap Application server project).
If we want to disable the rest/ or datasnap/ context requests, all we have to do is
implement this in the OnUserAuthenticate event of the DSAuthenticationManager
component, always returning False for the protocol that we do not want to support (this is
a quick way to disable REST if you only want to support DataSnap requests or vice versa).
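A sketch of that quick check (the event signature was shown earlier; the exact Context value to test against is an assumption here):

```delphi
procedure TWebModule1.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  // Always refuse REST requests, so only the datasnap/ context remains.
  valid := not SameText(Context, 'rest/');
end;
```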
Note that neither the OnUserAuthorize nor the OnUserAuthenticate event of the
TDSAuthenticationManager contains helpful comments like we get with the “normal”
DataSnap Application Server wizard.
If the option for a separate Server Module was checked, then the components on the Web
Module unit will be split. Some components will be left on the Web Module unit, like the
TDSHTTPWebDispatcher and the proxy generator components, but the DataSnap server
components TDSServer, TDSServerClass and TDSAuthenticationManager will be placed on
the new Server Container unit.
The web module unit will be as follows:
While the extra ServerContainer unit is the place where the DataSnap server components
are placed:
This has a serious consequence for the TDSHTTPWebDispatcher component, which can no
longer be connected directly to the TDSServer component. Instead, we have to use a
function called DSServer, implemented in the Server Container unit to return the instance
of the TDSServer component.
implementation

uses
  Winapi.Windows, ServerMethodsUnit1;

{$R *.dfm}

var
  FModule: TComponent;
  FDSServer: TDSServer;
  FDSAuthenticationManager: TDSAuthenticationManager;

destructor TServerContainer2.Destroy;
begin
  inherited;
  FDSServer := nil;
  FDSAuthenticationManager := nil;
end;

...

initialization
  FModule := TServerContainer2.Create(nil);

finalization
  FModule.Free;

end.
Personally, I think the use of global variables here could be solved in a more elegant way
by using class functions and class variables. But at least the global variables were added to
the (hidden) implementation section of the unit, so they are not visible outside of the
implementation.
The Web Module OnCreate event handler assigns and uses the global DSServer and
DSAuthenticationManager instances as follows:
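A sketch of that OnCreate handler, approximating the wizard-generated code (component names like DSHTTPWebDispatcher1 may differ in your project):

```delphi
procedure TWebModule1.WebModuleCreate(Sender: TObject);
begin
  // DSServer and DSAuthenticationManager are the functions exported by
  // the Server Container unit, returning the global component instances
  DSHTTPWebDispatcher1.Server := DSServer;
  if DSServer.Started then
  begin
    DSHTTPWebDispatcher1.DbxContext := DSServer.DbxContext;
    DSHTTPWebDispatcher1.Start;
  end;
  DSHTTPWebDispatcher1.AuthenticationManager := DSAuthenticationManager;
end;
```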
DataSnap WebBroker applications were the basis for the DataSnap REST applications
(which add specific support for generating HTML / JavaScript client support).
The first two application types use the new TIdHTTPWebBrokerBridge class, either on the
main form of the VCL application, or created dynamically in the console application. The
TIdHTTPWebBrokerBridge class from the IdHTTPWebBrokerBridge.pas unit has one
important property: DefaultPort, which should be set to the HTTP port that the Indy
HTTP Server will bind itself to. The HTTP Server can be started by setting the Active
property to True (and stopped by setting Active back to False).
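For the console target, the generated code looks along these lines (a sketch; the actual wizard-generated RunServer also reacts to console commands such as "start" and "stop"):

```delphi
procedure RunServer(APort: Integer);
var
  LServer: TIdHTTPWebBrokerBridge;
begin
  Writeln('Starting HTTP Server on port ' + IntToStr(APort));
  LServer := TIdHTTPWebBrokerBridge.Create(nil);
  try
    LServer.DefaultPort := APort; // the port the Indy HTTP server binds to
    LServer.Active := True;       // start listening
    Readln;                       // wait; stop when Enter is pressed
    LServer.Active := False;
  finally
    LServer.Free;
  end;
end;
```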
The first two application types produce executables that can be run on a server machine,
without need for a web server, although the firewall must explicitly allow incoming
requests at the specified DefaultPort of the TIdHTTPWebBrokerBridge class.
The version of Indy that shipped with Delphi XE did not support HTTPS out of the box
(there was no property or single line of code you could change to make the
TIdHTTPWebBrokerBridge use HTTPS instead of HTTP). However, HTTPS support is now possible with Delphi XE2.
The ISAPI dynamic link library target produces an ISAPI DLL that can be deployed on Internet
Information Services (IIS). As a consequence, it’s easy to support HTTPS as well as HTTP,
and the port that is used for incoming connections is determined by IIS itself (and firewalls
should be configured accordingly, but that is not a specific concern of the DataSnap REST
application).
The second page of the wizard is only shown if you select a stand-alone VCL application or
a stand-alone console application, and can be used to specify the port number for the
HTTP communication. This page is not shown if you selected an ISAPI dynamic link library,
since in that case IIS is responsible for the HTTP or HTTPS port number assignments.
By default, the HTTP port number is set to 8080. You can click on buttons to Test the port
(to see if no other process is currently bound to that port), or to find an available open
port. Note that only one process can "listen" on a certain port number at the same time,
and if you have multiple DataSnap REST servers implemented as stand-alone VCL or
stand-alone console applications, then each of them requires a unique HTTP port to listen
on, so choose your port numbers wisely!
Note that the HTTPS checkbox enables the use of HTTPS. The wizard suggests using
port 443 for HTTPS, but IIS will already have claimed that port, so you can only use 443 if
IIS is not installed on your server.
This port number is used by the TIdHTTPWebBrokerBridge class, by the way, but you can
always change it later. In the stand-alone VCL application the port number is put inside
the EditPort TEdit’s Text property on the main form, and in the stand-alone console
application the port value is hardcoded as parameter to the RunServer procedure at the
bottom of the project .dpr file.
Again, for an ISAPI dynamic link library, this is not an issue, and multiple DataSnap ISAPI
DLLs can listen to the same port number, because it’s not the ISAPI DLL, but rather IIS
that listens to the port number and will redirect the request to the correct DataSnap
Server.
If you selected HTTPS, then the next page allows you to enter the certificate file and
information. As shown before, you can specify the Certificate file name, Key file name,
Key file password, and the Root certificate file name for the X.509 certificate to use for the
HTTPS connection. When using IIS for an ISAPI DLL, this page will not be shown, since
this means you need to install the certificate with IIS (as shown elsewhere in this manual).
The next page can be used to select the DataSnap REST Server features. This includes
Authentication and Authorization (if Authentication is also checked), as well as a Server
Method Class and Sample Methods (if the Server Methods Class is checked), Mobile
Connectors and an extra Server Module. By default, the Authentication and Authorization,
Mobile Connectors and Server Module options are left unchecked.
Note that compared to the WebBroker DataSnap Application wizard, there is no option to
add the encryption or compression filters here. That’s because we cannot use the filters in
combination with REST: a REST client would not be able to apply the filters to the data
stream (as a DBX DataSnap client can). Make sure to secure the data and transport in
other ways – at least by using HTTPS, so SSL is used for the HTTP connection.
The next page is used to define the base class for the server methods ancestor class. Note
that you still get this page even if you selected no Server Method Class (reported against
Delphi XE as QC #88210, but still present in Delphi XE2).
The last page is used to define the directory to produce the project in. This location is
required, because several additional files are created and made part of the project.
Note that the name that you enter here will be the name of the directory but also the
name of the project itself. As a consequence, you should make sure that the final part of
the path is a valid identifier (to avoid compiler errors), and doesn’t contain spaces or start
with a digit for example. Otherwise you’ll get an error telling you that this final part is not
a valid identifier, and no project will be generated (and you have to start all over again).
There are also several additional files generated for each of the REST application project
types, to produce a working JavaScript client for these REST applications. The additional
files can be found in the css directory (main.css and ServerFunctionInvoker.css), images
(collapse.png and expand.png), js (base64-min.js, base64.js, CallbackFramework-min.js,
CallbackFramework.js, connection.js, json2-min.js, json.js, ServerFunctionExecutor-
min.js, ServerFunctionExecutor.js, ServerFunctionInvoker.js, ServerFunctions.js), and
templates (ReverseString.html and ServerFunctionInvoker.html).
The Web Module acts as the Server Container unit, and can be compared a little bit with
the Web Module of the DataSnap WebBroker Applications. A little bit, because we’ll find
several additional components on the web module (or on the combined Web Module and
Server Container units).
Note that these two units – the Web Module unit and the Server Container unit – are
integrated if you didn’t check the option to create a separate Server Module.
The two TPageProducer components are used to display an HTML page, taken from the
templates directory.
The TDSProxyGenerator component is used to produce a new JavaScript proxy in the file
ServerFunctions.js when needed (in practice, when the executable is more recent than the
ServerFunctions.js file). This is helpful when we’ve extended the Server
Methods unit with new functions, for example. Regenerating the ServerFunctions.js file
ensures that the client is always up-to-date.
The TWebFileDispatcher component is the one responsible for the actual trigger (in the
OnBeforeDispatch event) to regenerate the ServerFunctions.js file, calling the
DSProxyGenerator to do its job (when the executable is more recent than the JavaScript
file).
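A sketch of what such an OnBeforeDispatch handler does, approximating the wizard-generated code from memory (names and details may differ slightly in your generated unit):

```delphi
procedure TWebModule1.WebFileDispatcher1BeforeDispatch(Sender: TObject;
  const AFileName: string; Request: TWebRequest; Response: TWebResponse;
  var Handled: Boolean);
var
  JsDate, ExeDate: TDateTime;
begin
  Handled := False;
  if SameFileName(ExtractFileName(AFileName), 'serverfunctions.js') then
    if FileAge(AFileName, JsDate) and
       FileAge(GetModuleName(HInstance), ExeDate) then
      if JsDate < ExeDate then // executable newer: regenerate the proxy
      begin
        DSProxyGenerator1.TargetDirectory := ExtractFilePath(AFileName);
        DSProxyGenerator1.TargetUnitName := ExtractFileName(AFileName);
        DSProxyGenerator1.Write; // produce a fresh ServerFunctions.js
      end;
end;
```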
Finally, the TDSServerMetaDataProvider component is the one to supply the required
meta data to the TDSProxyGenerator.
The source code inside the DataSnap REST Web Module, compared to the DataSnap
WebBroker Web Module, contains five new event handlers and one modified one:
- WebModuleDefaultAction (modified)
- ServerFunctionInvokerHTMLTag
- WebFileDispatcher1BeforeDispatch
- WebModuleBeforeDispatch
- AllowServerFunctionInvoker
- WebModuleCreate
The implementation of these event handlers takes care of the REST client support, using
HTML and (partly dynamically) generated JavaScript.
DataSnap
- VCL Forms Application – HTTP, HTTPS & TCP/IP
- Console Application - HTTP, HTTPS & TCP/IP
- Service Application - HTTP, HTTPS & TCP/IP
WebBroker
- Stand-alone VCL application - Indy HTTP, HTTPS
- Stand-alone console application - Indy HTTP, HTTPS
- ISAPI dynamic link library - IIS HTTP & HTTPS
REST
- stand-alone VCL application - Indy HTTP, HTTPS
- Stand-alone console application - Indy HTTP, HTTPS
- ISAPI dynamic link library - IIS HTTP & HTTPS
Testing with “normal” DataSnap Servers shows that TCP/IP is a much faster transport than
HTTP, so when using DataSnap servers, I prefer the use of TCP/IP (and it’s a good thing
the wizard has TCP/IP selected by default, and not HTTP).
Since REST clients can only use HTTP or HTTPS, the consequence of using only
TCP/IP is that normal DataSnap Servers using TCP/IP will only support DBX client
connections, and not REST (unless we add HTTP Tunneling).
When using the WebBroker or REST DataSnap Servers, only HTTP or HTTPS is available.
When it comes to HTTPS, I personally find it more secure to (have the web master) install
a certificate in IIS for a specific (sub)domain, so I mainly use WebBroker ISAPI DLLs.
The main difference between WebBroker and REST is the built-in support for REST clients.
Since these are useful, if only for local testing, I’m using the REST project targets now
instead of the WebBroker project targets.
Also, in all cases, I personally find the VCL Forms application easier to use for tracing
and debugging (by adding visual controls – for single client testing) compared to console
applications, so I seldom create or use DataSnap console targets.
This leaves four targets that I often use, and I’ve created a project group with a skeleton
implementation of these four DataSnap project targets:
Or in table format:
                        DBX       REST
DataSnap VCL (debug)    TCP/IP    -
DataSnap Service        TCP/IP    -
REST VCL (debug)        HTTP(S)   HTTP(S)
REST ISAPI              HTTPS     HTTPS
Note that the DataSnap Service can support DBX and REST over HTTPS, but it’s not as fast
as TCP/IP, so it's not an ideal solution.
2. DataSnap Security
Security is a very important aspect of any application, especially multi-tier applications
where – often sensitive - data is being sent from the client to the server and back.
Security should not be added as an afterthought. That’s why I want to cover security
details and options here before covering the DataSnap Server components, DataSnap
Server Deployment and DataSnap Clients in more detail.
This leads to the following combination of protocols and filters for a secure encryption of
the data between client and server (for the most useful four DataSnap targets defined in
the previous section):
                        DBX                         REST
DataSnap VCL (debug)    TCP/IP + RSA/PC1 Filters    -
DataSnap Service        TCP/IP + RSA/PC1 Filters    -
REST VCL (debug)        HTTP(S)                     HTTP(S)
REST ISAPI              HTTPS                       HTTPS
For “standard” DataSnap Servers, we should limit ourselves to TCP/IP in combination with
the RSA / PC1 Filter. Using HTTP(S) is an option here, but not recommended because of
performance reasons (TCP/IP is much faster).
For REST DataSnap Servers, we should use HTTPS for the deployment, in which case no
additional filters are needed (but the quality of the encryption is defined by the HTTPS /
SSL certificate installed in IIS – see the next section). For ISAPI DLLs, the use of filters may
be less optimal, since the ISAPI DLL has a maximum buffer size of 48K (although this
is configurable by increasing the uploadReadAheadSize value in IIS_Schema.xml).
We need to explicitly add the filters to the DataSnap Server’s transport components.
These are the TDSTCPServerTransport component and/or the TDSHTTPService component
on the Server Container unit. Both have a Filters property that uses a property editor to
add filter items to the TTransportFilterCollection.
For each individual TTransportFilterItem, we can assign the FilterId in the Object
Inspector. By default, Delphi XE2 ships with the PC1, RSA and ZlibCompression filters. For
secure encryption, we need to add both the PC1 and the RSA filters. This will result in the
keys being sent using RSA, and the data encryption being done with PC1.
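In the .dfm, the result looks roughly like this (a sketch of the stored collection items; the exact stored properties may differ between versions):

```
object DSTCPServerTransport1: TDSTCPServerTransport
  Server = DSServer1
  Filters = <
    item
      FilterId = 'RSA'
    end
    item
      FilterId = 'PC1'
    end>
end
```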
In order for DataSnap client applications to use the filters, we only need to ensure that the
unit which contains the filter definitions (and registration) is added to the uses clause. For
the PC1 filter that’s the Data.DBXTransport unit (which is required anyway), for the RSA
filter it’s the Data.DBXRSAFilter unit, and for the ZlibCompression filter it’s the
Data.DbxCompressionFilter unit.
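So a client uses clause that enables all three filters could look like this:

```delphi
uses
  Data.DBXTransport,         // PC1 filter (and base transport types) – required anyway
  Data.DBXRSAFilter,         // registers the RSA filter
  Data.DbxCompressionFilter; // registers the ZlibCompression filter
```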
If you need filters to secure your data transfer in a DataSnap Service application, for
example, I suggest using TCP/IP as transport and avoiding HTTP(S), since the
HTTP(S) throughput is much slower compared to TCP/IP (and filters will only make it
worse). If your clients need to use HTTP(S), then it’s highly recommended to use
HTTPS, which means deployment of an X.509 certificate, or deployment as a web server
application – specifically an ISAPI DLL on IIS.
For DataSnap WebBroker / REST applications, we can assign the PC1 and RSA filters to the
Filters property of the TDSHTTPWebDispatcher component (covered later).
OpenSSL DLLs
DataSnap relies on OpenSSL for the encryption of the keys in the RSA + PC1 filter
combination, which requires some manual steps for 64-bit applications.
If you deploy 32-bit DataSnap servers, using the RSA and PC1 filters, then you must also
deploy two Indy specific OpenSSL DLLs: libeay32.dll and ssleay32.dll – or make sure they
already exist somewhere in the path of the server machine. These two DLLs are needed
for the RSA filter (which encrypts the password used by the PC1 filter). Without these two
DLLs, any client who wants to connect to the server will get a “Connection Closed
Gracefully” message, because the server was unable to load the two DLLs to start the RSA
filter to encrypt the PC1 keys. The same two DLLs are also required for any
DataSnap client, whether connected to the TCP/IP server using the RSA and PC1 filters, or
connected to the ISAPI DLL using HTTPS.
When Delphi is installed, three 32-bit versions of the OpenSSL DLLs can be found on your
machine (all three slightly different from each other):
- C:\Program Files\CollabNet
- C:\Program Files\Embarcadero\RAD Studio\9.0\bin\SubVersion
- C:\Program Files\FinalBuilder 7 XE2
Even on a 64-bit Windows machine, I could find no 64-bit version of OpenSSL, but you can
download a build from http://www.slproweb.com/products/Win32OpenSSL.html.
In the archive, you’ll find libeay32.dll and ssleay32.dll. Note that these are 64-bit versions
of the OpenSSL DLLs, having the same names as their 32-bit counterparts. Make sure they
do not end up in your 32-bit Windows directory. In fact, you may want to make sure that
they are in the directory of your 64-bit DataSnap Server and/or 64-bit DataSnap Client
only.
Apart from securing the data, DataSnap also allows us to add authentication (using login
username and password) and authorization (who is allowed to do what). These techniques
are implemented using the TDSAuthenticationManager (enhanced in Delphi XE), and can
be used independently of the communication protocol.
Authentication
Authentication is the process of verifying a username and password to allow someone
access to the DataSnap Server. At the DataSnap server side, this is implemented by the
TDSAuthenticationManager’s OnUserAuthenticate event handler. This event handler gives
us four parameters that we can use, and trace using CodeSite (add CodeSiteLogging to the
uses clause):
procedure TMyServerContainer.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  CodeSite.Send('User: ' + User);
  CodeSite.Send('Password: ' + Password);
  CodeSite.Send('Protocol: ' + Protocol);
  CodeSite.Send('Context: ' + Context);
  valid := True; // check for User and Password
end;
The User and Password are values passed by the client. Protocol can be tcp/ip, http, https,
or datasnap, and Context can be empty or tunnel (for HTTP tunneling, covered later).
By checking the Protocol parameter, we can for example allow only DataSnap connections from clients using the HTTPS protocol.
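A minimal sketch of such a protocol check, using the same event handler signature as above:

```delphi
procedure TMyServerContainer.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  // only accept clients that connect over HTTPS
  valid := SameText(Protocol, 'https');
end;
```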
Authorization
Authorization is the process of determining who is allowed to do what. This is implemented
using roles in DataSnap XE. In the OnUserAuthenticate event again, we can assign
UserRoles to the user who just logged in.
For example, user ‘bob’ can get the role ‘admin’ added to his UserRoles, while any other
user will get the role ‘guest’ added:
procedure TMyServerContainer.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  CodeSite.EnterMethod('OnUserAuthenticate');
  valid := LowerCase(Protocol) <> 'http'; // don't allow http
  if LowerCase(User) = 'bob' then
    UserRoles.Add('admin')
  else
    UserRoles.Add('guest');
  CodeSite.ExitMethod('OnUserAuthenticate ' + BoolToStr(valid));
end;
The actual roles can be verified in the OnUserAuthorize event handler. Note that Delphi by
default returns True in the OnUserAuthenticate and OnUserAuthorize event handlers,
which means anyone can log in and do anything they want.
The OnUserAuthorize event handler gets an EventObject that has properties for the
UserName (we should already know that), the UserRoles, the DeniedRoles (if the user has
any of these roles, then the user is not allowed access), and the AuthorizedRoles (if the
user has any of these roles, then the user is allowed access).
There are two ways to check for and enforce authorization: either you define that
everything is allowed unless the user has a role in the “DeniedRoles” set (an optimistic
approach), or you define that nothing is allowed unless the user has a role in the
“AuthorizedRoles” set (a more pessimistic approach).
The following example assumes that the user is allowed access, unless one of the roles of
the user is in the DeniedRoles set:
procedure TMyServerContainer.DSAuthenticationManager1UserAuthorize(
  Sender: TObject; EventObject: TDSAuthorizeEventObject;
  var valid: Boolean);
var
  UserRole: String;
begin
  CodeSite.EnterMethod('OnUserAuthorize');
  valid := True;
  if Assigned(EventObject.DeniedRoles) then
    for UserRole in EventObject.UserRoles do
      if EventObject.DeniedRoles.IndexOf(UserRole) >= 0 then
        valid := False;
  CodeSite.ExitMethod('OnUserAuthorize');
end;
Note that while the UserName and UserRoles properties of the EventObject are defined,
the DeniedRoles and AuthorizedRoles may be empty (and may not be assigned at all).
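The pessimistic counterpart, allowing access only when one of the user's roles appears in the AuthorizedRoles set, could look like this (a sketch mirroring the example above):

```delphi
procedure TMyServerContainer.DSAuthenticationManager1UserAuthorize(
  Sender: TObject; EventObject: TDSAuthorizeEventObject;
  var valid: Boolean);
var
  UserRole: String;
begin
  valid := False; // nothing is allowed unless explicitly authorized
  if Assigned(EventObject.AuthorizedRoles) then
    for UserRole in EventObject.UserRoles do
      if EventObject.AuthorizedRoles.IndexOf(UserRole) >= 0 then
        valid := True;
end;
```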
This leaves one question: how do we specify the DeniedRoles and AuthorizedRoles? For
that, we have to take a closer look at the server methods class definition.
The attribute only has effect on the declaration that follows the attribute. In this case, the
ReverseString function either requires the role ‘admin’ or is denied for the role ‘guest’. The
default behavior of Delphi is to allow all other roles (so the optimistic approach).
Note that the DataSnap.DSAuth unit is already added to the uses clause, so the TRoleAuth
class is known.
Apart from using the TRoleAuth class at the method level, we can also add it to the entire
Server Method class:
type
  [TRoleAuth('client', 'dummy')]
  TMyServerMethods = class(TDSServerModule)
  public
    function EchoString(Value: string): string;
    [TRoleAuth('admin', 'guest')]
    function ReverseString(Value: string): string;
  end;
This specifies that the ‘client’ role is good enough to explicitly get access to all methods in
the class, while the ‘dummy’ role denies access to the members. The ReverseString
function specifies that it is accessible for the ‘admin’ role, and denied for the ‘guest’ role.
Note that a method-level TRoleAuth attribute overrides the class-level one, and does not
inherit the higher-level permissions (so for the ReverseString method, the role ‘dummy’
no longer denies access in this example).
Using this code, a user called ‘Bob’ can login and gets access to both methods with the
given example implementations of the OnUserAuthorize and OnUserAuthenticate event
handlers.
TDSRoleItems
We can also define a collection of TDSRoleItems with “Authorized”, “Denied” and “ApplyTo”
values using the Roles property of the TDSAuthenticationManager component (which is
usually on the Server Container) where Authorized and Denied contain the names of the
roles, separated with commas, and ApplyTo contains a list of ClassNames, MethodNames
or ClassName.MethodName values.
As an example, we can ensure that only the user “admin” can execute the EchoString
method, and user “guest” cannot, which is configured in the TDSAuthenticationManager’s
Roles property editor as follows:
Note that we can also use this technique to prevent users from sending updates to a
TDataSetProvider using the ApplyUpdates method of a TClientDataSet at the client side.
For this, we need to assign “AS_ApplyUpdates” in the ApplyTo field, and then any role that
will allow this (like “admin”) in the AuthorizedRoles field.
This will ensure that the method AS_ApplyUpdates will be available for users with the role
“admin”, so they could call the ClientDataSet.ApplyUpdates method, but users with any
other role will be unable to call it, and will get a nice error message that authorization was
denied.
The only minor downside is that we can disable AS_ApplyUpdates in general, or for a specific
server methods class (using the combination ClassName.MethodName), but not for a specific
DataSetProvider.
Summary: Security
We can use HTTPS to secure the transport layer for ISAPI DLLs, and a combination of the
RSA and PC1 encryption filters to secure TCP/IP (and optionally HTTP and HTTPS
transports – although these are slow compared to TCP/IP).
We can also use authentication (who are you) and authorization (what are you allowed to
do) as additional security measures, including a role-based authorization mechanism
to secure server methods classes and methods (as well as TDataSetProviders exported
from the server methods class).
A few of these components will be covered in later sections (for example using callbacks or
the meta data components). The rest can be found in the ServerContainer or Web Module.
TDSServer
The TDSServer component has a total of six properties: AutoStart, ChannelQueueSize,
ChannelResponseTimeout (new in XE2), HideDSAdmin, Name and Tag.
The AutoStart property is set to True by default, which means that the DataSnap Server
will start as soon as the form is created. If AutoStart is set to False, you can
manually call the Start method, and you can always call the Stop method. You can use the
Started function to find out if the DataSnap Server is already started.
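For example, with AutoStart set to False, the server could be started and stopped from button handlers on the container form (a sketch; the button event handlers are hypothetical names, DSServer1 is the default component name):

```delphi
procedure TServerContainer1.StartButtonClick(Sender: TObject);
begin
  if not DSServer1.Started then
    DSServer1.Start; // begin accepting client connections
end;

procedure TServerContainer1.StopButtonClick(Sender: TObject);
begin
  if DSServer1.Started then
    DSServer1.Stop; // stop accepting client connections
end;
```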
The ChannelQueueSize property is new in Delphi XE, and defines the maximum number
of messages that can be queued for sending to a specific client at once. If the message
queue ever reaches that size, then any further messages will be ignored until the
queue shrinks again (as messages are sent to the client). Note that this does not
remove the oldest message, but instead ignores the latest (newest) message.
The default value of ChannelQueueSize is 100 messages, but you can change that number.
If you set it to -1, you get a queue of 255 messages.
Note that server methods from (other) server classes will still be able to call the DSAdmin
methods at the server side (they are not removed, simply no longer exposed with
HideDSAdmin set to True).
The TDSServer component has five events: OnConnect, OnDisconnect, OnError, OnPrepare
and OnTrace. We can write event handlers for these five events to respond to the different
situations, for example by writing a line of text to a log file.
The OnConnect, OnDisconnect, OnError and OnPrepare events have an argument derived
from TDSEventObject, which contains properties for the DbxContext, the Transport, the
Server and the DbxConnection component. The TDSConnectEventObject type, used for
OnConnect and OnDisconnect, also contains ConnectProperties as well as a read-only
ChannelInfo.
  TDSConnectEventObject = class(TDSEventObject)
    ...
  public
    property ConnectProperties: TDBXProperties
      read FConnectProperties write FConnectProperties;
    property ChannelInfo: TDBXChannelInfo read FChannelInfo;
  end;
The TDSErrorEventObject also includes the Exception that caused the error, and the
TDSPrepareEventObject includes properties for the MethodAlias and the ServerClass that
we want to use.
  TDSServerMethodUserEventObject = class(TDSServerMethodEventObject)
    ...
  public
    property UserName: string read GetUserName;
    property UserRoles: TStrings read FUserRoles;
    property AuthorizedRoles: TStrings read FAuthorizedRoles;
    property DeniedRoles: TStrings read FDeniedRoles;
  end;

  TDSServerMethodEventObject = class(TDSEventObject)
    ...
  public
    property ServerClass: TDSCustomServerClass read FServerClass write FServerClass;
    property MethodAlias: UnicodeString read FMethodAlias;
    property MethodInstance: TObject read FMethodInstance;
  end;

  TDSErrorEventObject = class(TDSEventObject)
    ...
  public
    property Error: Exception read FError write FError;
  end;
During the OnConnect event handler, we can examine the ChannelInfo for the connection,
for example:
procedure TServerContainer1.DSServer1Connect(
  DSConnectEventObject: TDSConnectEventObject);
var
  i: Integer;
begin
  CodeSite.EnterMethod('DSServer1.OnConnect');
  CodeSite.Send('ConnectProperties: ' +
    DSConnectEventObject.ConnectProperties.Properties.Text);
  if Assigned(DSConnectEventObject.ChannelInfo) then
  begin
    CodeSite.Send('ChannelInfo.Id: ' +
      IntToStr(DSConnectEventObject.ChannelInfo.Id));
    CodeSite.Send('ChannelInfo.Info: ' + DSConnectEventObject.ChannelInfo.Info);
  end;
  if Assigned(DSConnectEventObject.Transport) then
  begin
    CodeSite.Send('Transport: ' + DSConnectEventObject.Transport.ToString);
    CodeSite.Send('Transport.BufferKBSize: ' +
      IntToStr(DSConnectEventObject.Transport.BufferKBSize));
    for i := 0 to DSConnectEventObject.Transport.Filters.Count-1 do
      CodeSite.Send('Transport.Filter: ' +
        DSConnectEventObject.Transport.Filters.GetFilter(i).Id);
  end;
  CodeSite.ExitMethod('DSServer1.OnConnect');
end;
This will show the connection details (as well as the OnAuthenticate details):
Apart from OnConnect, we can also implement the OnDisconnect event handler, to report
that a connection is now disconnected.
The OnTrace event handler has an argument of type TDBXTraceInfo. We can implement
the OnTrace event handler of the TDSServer component, and log the contents of the
TraceInfo.Message to get a good idea what the server is doing.
function TServerContainer1.DSServer1Trace(TraceInfo: TDBXTraceInfo): CBRType;
begin
  CodeSite.EnterMethod('DSServer1.OnTrace');
  CodeSite.Send('Trace Category: ' + TraceInfo.CustomCategory);
  CodeSite.Send('Trace Message: ' + TraceInfo.Message);
  Result := cbrUSEDEF; // take default action
  CodeSite.ExitMethod('DSServer1.OnTrace');
end;
We need to add the Data.DBXCommon and Data.DBCommonTypes units to the uses clause
to make this compile by the way. Note that at the client side you can also trace the
communication between the DataSnap client and server by using a TSQLMonitor
component connected to the TSQLConnection component. An example of the trace output
can be as follows:
As you can see, the TraceInfo.Message contains information about the number of bytes as
well as the method that was called.
There is also an OnPrepare event, as well as an OnError event handler for the TDSServer.
Especially the OnError event handler is important to at least log the error (for example
using CodeSite) in order to keep track of problems.
procedure TServerContainer1.DSServer1Error(
  DSErrorEventObject: TDSErrorEventObject);
begin
  CodeSite.SendException(DSErrorEventObject.Error);
end;
TDSServerClass
The TDSServerClass component is responsible for specifying the class at the server side
which is used to expose published methods to the remote clients (using dynamic method
invocation).
The TDSServerClass component has a Server property that points to the TDSServer
component. The other important property – apart from the Name and Tag properties – is
the LifeCycle property. By default it’s set to Session, but it can also be set to Server or
Invocation. From longest to shortest lifetime: Server, Session and Invocation mean that
one instance of the class is used for the entire lifetime of the server, for the lifetime of
the DataSnap session, or for a single invocation of a method, respectively.
Session
A lifecycle value of Session means that each incoming connection will get its own instance
of the server class. This is a state-full solution. It means that we can ask for “the next 10
records” for example, for a specific client connection, since the associated instance of the
server class will have remembered the current record (since it’s just one instance for the
client connection).
The benefit of using this LifeCycle approach is that it’s easy to program. And it’s the
default setting of DataSnap, so by default all DataSnap Servers will use the Session
LifeCycle, and have the server class at the DataSnap server side maintain a unique
instance for each individual client connection.
The downside of this approach is that the server maintains (in memory) this unique
instance of the server class for each individual client connection. This means that the
DataSnap Server is not really scalable beyond a few hundred client connections (and
possibly even less, depending on the memory requirements of your DataSnap server
class). If you have or expect to get more than a few hundred concurrent client
connections, you should consider a LifeCycle of Invocation instead.
Invocation
If you change the LifeCycle value to Invocation, you will end up with a state-less server
class. Each new request will get a new instance of the server class, and this instance will
be released right after the request is finished. So a subsequent request will most likely get
a different instance next time a request is made (even if instances of the server class are
pooled instead of immediately freed).
As a consequence, the server class instance will not be able to hold state information from
clients, resulting in a truly stateless DataSnap server. This is the best solution for a
scalable architecture.
The benefit of using this LifeCycle approach is that you get a scalable DataSnap
architecture, able to service a significant number of concurrent client connections. The
number of connections no longer has a relation to the number of sessions (there no longer
are sessions), so more connections does not lead to more memory requirements on the
server.
The downside of using this approach is that it’s a bit harder to program. Since the server is
stateless, we can no longer request the “next 5 records”, but have to pass the state
information from the client to the server with each request that requires state information.
A more detailed example of this will be given later in this courseware manual.
It’s possible to start with a state-full LifeCycle set to Session, while changing this to
Invocation later. Note that apart from changing the LifeCycle property, you will also need
to implement some events to pass the state information from client to server.
Server
When you change the LifeCycle to Server, it means that a single server class instance is
shared with all incoming connections and requests. This can be useful if you want to
“count” the number of requests for example, but you must ensure that no threading issues
can occur (when multiple requests come in and expect to be handled simultaneously).
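As a sketch of such a request counter (TMyServerMethods and FRequestCount are hypothetical names here; InterlockedIncrement from Winapi.Windows guards the shared counter against simultaneous requests):

```delphi
type
  TMyServerMethods = class(TDSServerModule)
  private
    class var FRequestCount: Integer; // shared by all connections (LifeCycle = Server)
  public
    function EchoString(Value: string): string;
  end;

function TMyServerMethods.EchoString(Value: string): string;
begin
  InterlockedIncrement(FRequestCount); // thread-safe increment of shared state
  Result := Value;
end;
```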
When using Delphi 2009, or when manually placing the TDSServerClass component on a
container, the OnGetClass event needs to be implemented by us, since we need to specify
which class will be “remoted” from the server to the client. The DataSnap wizards in Delphi
2010 and higher, however, will now automatically implement the OnGetClass event
handler for us, as follows:
procedure TServerContainer1.DSServerClass1GetClass(
DSServerClass: TDSServerClass; var PersistentClass: TPersistentClass);
begin
PersistentClass := ServerMethods.TMyServerMethods;
end;
Note that you may have to modify this code if you save the ServerMethods in a different
unit and/or if you rename the Server Methods class.
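For example, assuming the Server Methods unit was saved as ServerMethodsUnit1 and the class renamed to TDirtServerMethods (both names are hypothetical), the event handler would become:

```delphi
uses ServerMethodsUnit1; // the unit that now holds the renamed class

procedure TServerContainer1.DSServerClass1GetClass(
  DSServerClass: TDSServerClass; var PersistentClass: TPersistentClass);
begin
  PersistentClass := ServerMethodsUnit1.TDirtServerMethods; // renamed class
end;
```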
TDSTCPServerTransport
The TDSTCPServerTransport component is responsible for the communication between
the DataSnap server and the DataSnap clients, using TCP/IP as the communication
protocol. The TCP/IP protocol turns out to be faster than the HTTP or HTTPS
communication protocols.
TDSHTTPService
The TDSHTTPService component is responsible for the communication between the
DataSnap server and the DataSnap clients using the HTTP protocol. Note that like the
TDSTCPServerTransport component, the TDSHTTPService component is only used in
‘normal’ DataSnap server applications. For WebBroker based DataSnap (REST) server
applications, the HTTP(S) communication is done using either a TIdHTTPWebBrokerBridge
component (using an Indy HTTP Server) for DataSnap executables, or using IIS to handle
the HTTP(S) transport for DataSnap ISAPI DLLs.
The TDSHTTPService component has 17 properties (apart from the Name and Tag
property): Active, AuthenticationManager, CacheContext (introduced in Delphi XE),
CertFiles (new in Delphi XE2), CredentialsPassThrough (introduced in Delphi XE),
DSAuthPassword, DSAuthUser and DSContext (all three introduced in Delphi XE),
DSHostname, DSPort, Filters, HttpPort, IPImplementationID (new in Delphi XE2),
RESTContext, Server, SessionTimeout, and the read-only ServerSoftware property.
The RESTContext property defines the REST context URL that we can use to call the
DataSnap server as a REST service. By default, the RESTContext is set to “rest”, which
means we can call the server as http://localhost/datasnap/rest/... as I’ll demonstrate in
the section on REST, JSON and client callback methods.
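For example, assuming a stand-alone server listening on HTTP port 8080 (the port is an assumption here), the EchoString server method could be invoked with a GET request like:

```
http://localhost:8080/datasnap/rest/TMyServerMethods/EchoString/Hello
```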
The Server property should be pointed to a TDSServer component on the same container.
If you do not want to connect the Server property, you can use the DSHostname and
DSPort properties to connect to the DataSnap Server using TCP/IP. When the Server
property is set, the DSHostname and DSPort properties are ignored.
Finally, the SessionTimeout property is used to define the lifetime of the HTTP session.
By default it’s set to 1200000 milliseconds, or 1200 seconds (20 minutes).
The TDSHTTPService component used to have five events: four REST related, and one
Trace event. The REST events are no longer present, and apart from the OnTrace, there is
also an OnFormatResult event handler now.
The OnTrace event can be used to trace the calls to the DSHTTPService component, for
example as follows:
procedure TServerContainer1.DSHTTPService1Trace(Sender: TObject;
AContext: TDSHTTPContext; ARequest: TDSHTTPRequest;
AResponse: TDSHTTPResponse);
begin
CodeSite.EnterMethod('DSHTTPService1Trace');
CodeSite.Send(AContext.ToString);
CodeSite.Send(ARequest.Document);
CodeSite.Send(AResponse.ResponseText);
CodeSite.ExitMethod('DSHTTPService1Trace');
end;
Note that the HTTP Trace information will only be shown when the client actually uses
HTTP to connect to the server (and not the default TCP/IP protocol).
An example of the CodeSite message output can be as follows:
After entering the DSHTTPService OnTrace event, we first get the DSServer OnTrace, the
OnUserAuthenticate, and DSServer OnConnect events, after which the contents of the
DSHTTPService OnTrace follows with the output that states that the AContext is set to
TDSHTTPContextIndy, the ARequest is set to /datasnap/tunnel, and the AResponse is set
to OK. And finally, we see the exit of the DSHTTPService OnTrace event.
The other event handler of the TDSHTTPService component is the OnFormatResult, which
can be implemented as follows:
procedure TServerContainer1.DSHTTPService1FormatResult(Sender: TObject;
var ResultVal: TJSONValue; const Command: TDBXCommand; var Handled: Boolean);
begin
end;
Note that we need to set Handled to True if we actually handled the formatting ourselves.
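As a sketch of what such an implementation could look like (the wrapping format is an assumption for illustration, not the default behavior), we could wrap the result in a JSON object of our own and set Handled to True:

```delphi
uses Data.DBXJSON; // TJSONObject, TJSONValue (Delphi XE2 unit name)

procedure TServerContainer1.DSHTTPService1FormatResult(Sender: TObject;
  var ResultVal: TJSONValue; const Command: TDBXCommand; var Handled: Boolean);
var
  Obj: TJSONObject;
begin
  Obj := TJSONObject.Create;
  Obj.AddPair('result', ResultVal); // wrap the original result value
  ResultVal := Obj;
  Handled := True; // we formatted the result ourselves
end;
```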
The WebFileExtensions property defines the web file extensions that are supported by
the TDSHTTPServiceFileDispatcher. In the example shown above, the extensions
css, html;htm, js, jpeg;jpg and png are included as supported types.
procedure TServerContainer1.DSHTTPServiceFileDispatcher1BeforeDispatch(
Sender: TObject; const AFileName: string; AContext: TDSHTTPContext;
Request: TDSHTTPRequest; Response: TDSHTTPResponse; var Handled: Boolean);
begin
end;
procedure TServerContainer1.DSHTTPServiceProxyDispatcher1AfterDispatch(
Sender: TObject; const AFileName: string; AContext: TDSHTTPContext;
Request: TDSHTTPRequest; Response: TDSHTTPResponse; var Handled: Boolean);
begin
end;
Using these event handlers we can log which files (and proxies) are dispatched, or
overrule the behavior and dispatch our own files or proxies (setting the Handled argument
to true if no further processing by the component is required).
TDSAuthenticationManager
Delphi 2010 introduced the TDSHTTPServiceAuthenticationManager component, which only
supported HTTP Authentication for the HTTP communication protocol. This was not very
useful, since there was no support for HTTPS.
Delphi XE introduced the TDSAuthenticationManager component, which can be used for
TCP/IP, HTTP and HTTPS communication protocols. The TDSAuthenticationManager
component must be connected to the AuthenticationManager property of a
TDSHTTPService and/or TDSTCPServerTransport components.
The TDSAuthenticationManager component has one important property called
Roles, which can contain a list of TDSRoleItems. A TDSRoleItem in turn consists of three
properties: ApplyTo, AuthorizedRoles, and DeniedRoles. The ApplyTo part can be used to
specify a class name (in which case it applies to all methods of that class), a method name
(in which case it applies to all methods with that name, no matter from which class), or a
class name followed by a dot followed by the method name, which is the most specific way
to specify a method of a particular class.
As an example, in the previous section we defined a class TMyServerMethods with two sets
of authentication roles, as follows:
type
[TRoleAuth('client', 'dummy')]
TMyServerMethods = class(TDSServerModule)
private
{ Private declarations }
public
{ Public declarations }
function EchoString(Value: string): string;
[TRoleAuth('admin', 'guest')]
function ReverseString(Value: string): string;
end;
This specifies that the ‘client’ role is enough to get access to all methods in the class,
while the role 'dummy' is explicitly denied all access. It works differently for the
ReverseString method, however: a TRoleAuth attribute on a method overrides the one
on the class, and does not inherit its permissions (so for the ReverseString method, the
role ‘client' no longer explicitly allows access, nor does the role 'dummy' deny access;
instead, 'admin' is allowed and 'guest' is denied).
This can also be expressed using the Roles property of the TDSAuthenticationManager as
follows:
We can also add these in code, by adding individual TDSRoleItem instances to the
TDSAuthenticationManager’s Roles collection.
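A sketch of that approach, for example in the container’s OnCreate event (assuming ApplyTo, AuthorizedRoles and DeniedRoles are the TStrings properties of TDSRoleItem):

```delphi
procedure TServerContainer1.DataModuleCreate(Sender: TObject);
var
  Role: TDSRoleItem;
begin
  Role := TDSRoleItem(DSAuthenticationManager1.Roles.Add);
  Role.ApplyTo.Add('TMyServerMethods');               // applies to the whole class
  Role.AuthorizedRoles.Add('client');
  Role.DeniedRoles.Add('dummy');

  Role := TDSRoleItem(DSAuthenticationManager1.Roles.Add);
  Role.ApplyTo.Add('TMyServerMethods.ReverseString'); // one specific method
  Role.AuthorizedRoles.Add('admin');
  Role.DeniedRoles.Add('guest');
end;
```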
procedure TServerContainer1.DSAuthenticationManager1UserAuthorize(
Sender: TObject; EventObject: TDSAuthorizeEventObject;
var valid: Boolean);
begin
{ TODO : Authorize a user to execute a method.
Use values from EventObject such as UserName, UserRoles, AuthorizedRoles and DeniedRoles.
Use DSAuthenticationManager1.Roles to define Authorized and Denied roles
for particular server methods. }
valid := True;
end;
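A minimal implementation could, for example, deny a specific user name access to all methods (a sketch only; EventObject also exposes UserRoles, AuthorizedRoles and DeniedRoles for finer-grained checks):

```delphi
procedure TServerContainer1.DSAuthenticationManager1UserAuthorize(
  Sender: TObject; EventObject: TDSAuthorizeEventObject;
  var valid: Boolean);
begin
  // Deny the (hypothetical) "guest" account, allow everybody else:
  valid := not SameText(EventObject.UserName, 'guest');
end;
```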
For a DataSnap WebBroker server, the web module does not contain the
TWebFileDispatcher, the ServerFunctionInvoker and ReverseString TPageProducers, the
TDSProxyGenerator and the TDSServerMetaDataProvider components.
TDSRestConnection
TDSRestConnection (like TSQLConnection) is not really a DataSnap Server component, but
can be used in a DataSnap REST Client to connect to a DataSnap REST Server. We will see
this component in more detail in the next section.
TDSHTTPWebDispatcher
The TDSHTTPWebDispatcher component is used in the web module for DataSnap REST server
applications. It is a WebBroker component that dispatches the DataSnap specific requests.
It contains no less than 15 properties: AuthenticationManager (which can be connected to
a TDSAuthenticationManager component), CacheContext, CredentialsPassThrough,
DSAuthPassword, DSAuthUser, DSContext, DSHostname, DSPort, Filters, Name,
RESTContext, Server (which can be connected to a TDSServer component),
SessionTimeout, Tag and WebDispatch.
CacheContext, DSContext, and RESTContext are predefined and should not be changed.
CacheContext is used as the cache part of a URL like
http://www.bobswart.com/datasnap/cache/..., and the same holds for the DSContext (the
datasnap part) and RESTContext (the rest part) values.
The DSHostname, and DSPort values are only used when the Server property is not
assigned. DSAuthPassword and DSAuthUser are used to specify the credentials to connect
to the DSHostname and DSPort when needed.
SessionTimeout specifies the time in milliseconds a session will remain valid. If a session is
invalid, it will be marked as expired and may be removed. The default value is 1200000
milliseconds, which means 1200 seconds (or 20 minutes). Note that this timeout only
applies to the sessions that the TDSHTTPWebDispatcher maintains for datasnap requests.
If you set SessionTimeout to 0, then the session will never expire.
The property WebDispatch is used to define the type of requests that will be processed. By
default, in the generated DataSnap REST applications, the WebDispatch methodType will
be set to mtAny and the WebDispatch PathInfo will be set to datasnap* in order to handle
all datasnap* requests for us.
The default settings of the TDSHTTPWebDispatcher component on the generated web
module are as follows:
object DSHTTPWebDispatcher1: TDSHTTPWebDispatcher
DSContext = 'datasnap/'
RESTContext = 'rest/'
CacheContext = 'cache/'
Server = DSServer1
DSHostname = 'localhost'
DSPort = 211
Filters = <>
AuthenticationManager = DSAuthenticationManager1
CredentialsPassThrough = False
SessionTimeout = 1200000
WebDispatch.MethodType = mtAny
WebDispatch.PathInfo = 'datasnap*'
end
WebActions
For the DataSnap REST server projects, the web module has the following three actions
defined:
TPageProducers
Returning to the two TPageProducer components, let’s examine the templates files
specified in their HTMLFile property as well as their OnHTMLTag property in a bit more
detail now. Both the ReverseString.html and ServerFunctionInvoker.html files start with
the same set of include JavaScript files:
<script type="text/javascript" src="js/base64.js"></script>
<script type="text/javascript" src="js/json-min.js"></script>
<script type="text/javascript" src="js/serverfunctionexecutor.js"></script>
<script type="text/javascript" src="js/connection.js"></script>
<script type="text/javascript" src="<#serverfunctionsjs>"></script>
Note the <#serverfunctionsjs> tag which will be resolved dynamically in the OnHTMLTag
event handler. The ServerFunctionInvoker.html also includes the JavaScript
js/ServerFunctionInvoker.js file, and both include a custom .css file from the css directory.
Then, both HTML files contain some JavaScript code that checks the login credentials if
these are required. It’s a bit funny that this code differs in the two HTML files, but the
common part of the JavaScript implementation is roughly as follows:
function onLoad()
{
showTime();
loginRequired = <#loginRequired>;
setConnection('<#host>', '<#port>', '<#urlpath>');
if (loginRequired)
{
showLogin(true);
}
else
{
showLogin(false);
}
}
This will ensure that the login option is not shown when it’s not needed. The special tags
<#loginRequired>, <#host>, <#port>, and <#urlpath> need to be dynamically resolved
by the OnHTMLTag event handler of course.
procedure TWebModuleREST.ServerFunctionInvokerHTMLTag(Sender: TObject;
Tag: TTag; const TagString: string; TagParams: TStrings;
var ReplaceText: string);
begin
if SameText(TagString, 'urlpath') then
ReplaceText := string(Request.InternalScriptName)
else if SameText(TagString, 'port') then
ReplaceText := IntToStr(Request.ServerPort)
else if SameText(TagString, 'host') then
ReplaceText := string(Request.Host)
else if SameText(TagString, 'classname') then
ReplaceText := ServerMethods.TMyServerMethods.ClassName
else if SameText(TagString, 'loginrequired') then
if DSHTTPWebDispatcher1.AuthenticationManager <> nil then
ReplaceText := 'true'
else
ReplaceText := 'false'
else
if SameText(TagString, 'serverfunctionsjs') then
ReplaceText := string(Request.InternalScriptName) + '/js/serverfunctions.js'
else
if SameText(TagString, 'servertime') then
ReplaceText := DateTimeToStr(Now)
else
if SameText(TagString, 'serverfunctioninvoker') then
if AllowServerFunctionInvoker then
ReplaceText :=
'<div><a href="' + string(Request.InternalScriptName) +
'/ServerFunctionInvoker" target="_blank">Server Functions</a></div>'
else
ReplaceText := '';
end;
Note that the servertime #-tag is replaced, but not actually included in any of the HTML
files. This has been replaced – I reckon – by the JavaScript function showTime() which is
called in the onLoad() function.
TWebFileDispatcher
The TWebFileDispatcher component is used on the DataSnap REST web module, but is not
really a DataSnap component. It’s in fact a “normal” WebBroker internet component, used
to dispatch file requests for web applications.
The TWebFileDispatcher contains five properties: Name, RootDirectory (by default the
current directory), Tag, WebDirectories and WebFileExtensions. The WebDirectories
property is a collection of directories for which this component will dispatch the files if a
request comes in. The WebFileExtensions property contains a list of file extensions and
mime types. For the generated DataSnap REST data module, the TWebFileDispatcher is
initially defined as follows:
object WebFileDispatcher1: TWebFileDispatcher
WebFileExtensions = <
item
MimeType = 'text/css'
Extensions = 'css'
end
item
MimeType = 'text/javascript'
Extensions = 'js'
end
item
MimeType = 'image/x-png'
Extensions = 'png'
end>
BeforeDispatch = WebFileDispatcher1BeforeDispatch
WebDirectories = <
item
DirectoryAction = dirInclude
DirectoryMask = '*'
end
item
DirectoryAction = dirExclude
DirectoryMask = '\templates\*'
end>
RootDirectory = '.'
end
The TWebFileDispatcher component is not needed when building an ISAPI DLL, since IIS
will be used to host the ISAPI DLL and dispatch the files. But for WebBroker based
DataSnap servers, the additional files from the css, images, and js directories need to be
dispatched.
This TWebFileDispatcher component is also responsible for the actual trigger (in the
OnBeforeDispatch event) to regenerate the ServerFunctions.js file, calling the
DSProxyGenerator to do its job (when the executable itself is more recent than the
JavaScript file).
procedure TWebModuleREST.WebFileDispatcher1BeforeDispatch(Sender: TObject;
const AFileName: string; Request: TWebRequest; Response: TWebResponse;
var Handled: Boolean);
var
D1, D2: TDateTime;
begin
Handled := False;
if SameFileName(ExtractFileName(AFileName), 'serverfunctions.js') then
if FileAge(AFileName, D1) and FileAge(WebApplicationFileName, D2) and (D1 < D2) then
begin
DSProxyGenerator1.TargetDirectory := ExtractFilePath(AFileName);
DSProxyGenerator1.TargetUnitName := ExtractFileName(AFileName);
DSProxyGenerator1.Write;
end;
end;
Note that this can be a problem for ISAPI DLLs, since it means that the ISAPI DLL has to
have write access to the “js” directory where the JavaScript files are stored. Which is not
very secure, to say the least.
Personally, I do not like that, so whenever the ISAPI DLL changes, I also regenerate the
ServerFunctions.js file (by running the stand-alone executable) and upload both files to
the web server. I also place one additional line of code to ensure that the check and
possible regeneration of the serverfunctions.js file is only done when the project is not a
library (i.e. when it’s not an ISAPI DLL but a stand-alone VCL application).
procedure TWebModuleREST.WebFileDispatcher1BeforeDispatch(Sender: TObject;
const AFileName: string; Request: TWebRequest; Response: TWebResponse;
var Handled: Boolean);
var
D1, D2: TDateTime;
begin
Handled := False;
if not IsLibrary then // do not regenerate for ISAPI DLL
if SameFileName(ExtractFileName(AFileName), 'serverfunctions.js') then
if FileAge(AFileName, D1) and FileAge(WebApplicationFileName, D2) and (D1 < D2) then
begin
DSProxyGenerator1.TargetDirectory := ExtractFilePath(AFileName);
DSProxyGenerator1.TargetUnitName := ExtractFileName(AFileName);
DSProxyGenerator1.Write;
end;
end;
TDSClientCallbackChannelManager
The TDSClientCallbackChannelManager component can be used to manage the callback
channels from REST clients to DataSnap REST servers. This component will be covered in
more detail in the callback section.
TDSServerMetaDataProvider
The TDSServerMetaDataProvider component is the one to provide the meta data for the
TDSProxyGenerator. The meta data provider is connected to a TDSServer component. The
DataSnap REST server project uses a TDSServerMetaDataProvider component to feed the
meta data requirements of the TDSProxyGenerator component.
TDSConnectionMetaDataProvider
The TDSConnectionMetaDataProvider component is the one to provide the meta data for
the TDSProxyGenerator. The TDSConnectionMetaDataProvider is connected to a
TSQLConnection component.
TDSRestMetaDataProvider
The TDSRestMetaDataProvider component is the one to provide the meta data for the
TDSProxyGenerator. The meta data provider is connected to a TDSRestConnection
component.
Internally, a meta data provider maintains a linked list of TDSProxyClass instances that
the proxy generator walks through; part of the class declaration looks as follows:
private
FFirstClass: TDSProxyClass;
FLastClass: TDSProxyClass;
procedure Clear;
public
property Classes: TDSProxyClass read FFirstClass;
end;
Note that there are no authorization attributes in the class at this time, so all methods can
be called by any client.
The implementation of the ServerTime method is really easy, almost as short as the
example methods EchoString and ReverseString:
function TMyServerMethods.ServerTime: TDateTime;
begin
Result := Now
end;
For server method parameter types that are not directly supported, we can always convert
these into native types like String, or pass them in a JSON value. See the section on JSON
and callback channels for more information on using JSON.
MaxParams
There is a maximum number of ServerMethods parameters that can be used, and this
maximum is hardcoded as 32 in the System.ObjAuto.pas unit, inside the ObjectInvoke
function. Note that we can still define methods with more than 32 parameters, and can
even import them using the proxy generator, but calling them will raise an exception.
This can be observed by writing the following Server Methods:
type
TMyServerMethods = class(TDSServerModule)
private
{ Private declarations }
public
{ Public declarations }
function EchoString(Value: string): string;
[TRoleAuth('admin')]
function ReverseString(Value: string): string;
function ServerTime: TDateTime;
function Add33(const X1,X2,X3,X4,X5,X6,X7,X8,X9,X10,
X11,X12,X13,X14,X15,X16,X17,X18,X19,X20,
X21,X22,X23,X24,X25,X26,X27,X28,X29,X30,
X31,X32,X33: Integer): Integer; // new
end;
Implementing it as follows:
function TMyServerMethods.Add33(const X1, X2, X3, X4, X5, X6, X7, X8, X9, X10,
X11, X12, X13, X14, X15, X16, X17, X18, X19, X20, X21, X22, X23, X24, X25,
X26, X27, X28, X29, X30, X31, X32, X33: Integer): Integer;
begin
Result := X1+X2+X3+X4+X5+X6+X7+X8+X9+X10+ X11+X12+X13+X14+X15+X16+X17+
X18+X19+X20+X21+X22+X23+X24+X25+X26+X27+X28+X29+X30+X31+X32+X33
end;
And calling it from the client, using the generated TMyServerMethodsClient proxy class:
var
Server: TMyServerMethodsClient;
begin
Server := TMyServerMethodsClient.Create(SQLConnection1.DBXConnection);
try
ShowMessage(IntToStr(
Server.Add33(1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,1,1,1))) //33 parameters
finally
Server.Free
end
end;
You will get an error message (at run-time) when calling the server method:
Note that this limit may be increased in the future, but for now it is hardcoded as
MaxParams = 32 on line 413 in the System.ObjAuto unit. If 32 parameters are not
enough, you can always combine them as (more than 32) fields in a class (not in a
record), and pass that instance instead. I leave that proof as an exercise for the reader.
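As a quick sketch of that workaround (TAddParams and AddMany are hypothetical names, and you should verify that the field type you pick marshals cleanly to JSON), the values travel inside a single class instance:

```delphi
type
  // Carrier class holding more values than the 32-parameter limit allows:
  TAddParams = class
  public
    Values: array[1..33] of Integer;
  end;

function TMyServerMethods.AddMany(Params: TAddParams): Integer;
var
  I: Integer;
begin
  Result := 0;
  for I := Low(Params.Values) to High(Params.Values) do
    Inc(Result, Params.Values[I]); // sum all 33 values
end;
```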
If you created multiple project targets, then the easiest one to test right now is the
DataSnapServerVCL executable. However, when you run it, something else may pop-up as
well. Depending on your version of Windows and level of security settings, you may see a
Windows Security Alert to tell you that Windows Firewall has blocked some features of the
DataSnapServer application.
You can allow the DataSnap Server to communicate on one or more of your networks. In
the dialog below, I’ll allow the DataSnapServer to communicate over my private networks
at home and at work, but not at any public networks (for example using my wifi
connection), because these typically have very little security.
The Windows Security Alert dialog is shown because the DataSnapServer application is
actively listening for incoming requests over TCP/IP and HTTP at this time; much like a
Trojan Horse would, except that this is a listener we actually want. So click on the Allow
Access button and continue to start the DataSnap Server.
Note that if the AutoStart property of the TDSServer component is set to False, then we
would not have immediately seen this dialog, but it would have popped-up (for the first
time) when we called the Start method of the TDSServer component.
Then, inside Delphi XE2, click on the Data Explorer tab of the Project Manager, and open
up the DATASNAP node of the dbExpress tree.
Select the DATASNAPCONNECTION item, or create a new one, right-click on it and select
“Modify Connection”. In the dialog that follows, we can define the protocol (like tcp/ip or
http), the host (by default localhost) and port number (default 211 for TCP/IP, 80 or 8080
for HTTP – note that the IDE cannot make a test connection using HTTPS). You may have
to specify a path if the server is deployed using IIS.
When all required information is specified, click on the Test Connection button to verify the
information. If you get an error message, then you have to fix something (note that the
IDE is not able to register an encryption filter at design-time). If the filters were added to
the transport components, then you may see the following error message from the Delphi
XE2 IDE:
Unfortunately, I do not know a trick to register the filters in the IDE (the DataSnap design-
time package itself is already loaded), so the only way to work with a DataSnap Server
that uses the encryption filters, is to remove them temporarily from the list of filters. This
can be done at startup of the Server Container class as follows:
{$IFDEF DEBUG}
DSTCPServerTransport1.Filters.Clear; // BS
{$ENDIF}
This will make sure the filters are cleared when the application is started in DEBUG mode.
And now we can make a connection from the Delphi XE2 IDE to the DataSnap Server:
Once we can make a valid connection, it’s time to close the Modify Connection dialog, and
open up the DATASNAPCONNECTION node in the dbExpress tree of the Data Explorer.
This will show a list of subnodes for Tables and Views (which both give a Remote error:
Dbx.MetaData is an unrecognized command type), Procedures (a long list), Functions
(empty) and Synonyms (which gives a message that “this method must be implemented in
a derived class”).
The list of procedures is the only useful entry we get at this time. It starts with the
DSAdmin methods, and also includes DSMetaData methods (unless we’ve set the
HideDSAdmin property of the TDSServer component to True), and finally the
TMyServerMethods methods.
These include the seven AS_ specific IAppServerFastCall methods as well as the
EchoString, ReverseString and ServerTime methods.
We can select one of these methods, and right-click on it to “View Parameters”. This will
give a window where we can enter parameter values, and watch the result.
For example, for the ReverseString method, we can enter a parameter value, click on the
execute button (upper left corner) and see the result.
We can perform this trick with all exposed Server Methods (as long as the authorization
allows it, of course).
Also interesting to see is the CodeSite Live Viewer window with the CodeSite Trace
messages that display some information of the DataSnap server:
{$R *.res}
{$IFDEF DEBUG}
var
Destination: TCodeSiteDestination;
{$ENDIF}
begin
Application.Initialize;
Application.MainFormOnTaskbar := True;
{$IFDEF DEBUG}
CodeSite.Enabled := True;
CodeSite.Enabled := CodeSite.Installed;
if CodeSite.Enabled then
begin
Destination := TCodeSiteDestination.Create(Application);
Destination.LogFile.Active := True;
Destination.LogFile.FileName :=
ChangeFileExt(ExtractFileName(Application.ExeName), '.csl');
Destination.LogFile.FilePath := '$(MyDocs)\My CodeSite Files\Logs\';
Destination.Viewer.Active := True; // also show Live Viewer
CodeSite.Destination := Destination;
CodeSite.Clear
end;
{$ELSE}
CodeSite.Enabled := False;
{$ENDIF}
Application.CreateForm(TFormMain, FormMain);
Application.CreateForm(TServerContainer1, ServerContainer1);
Application.Run;
end.
This will ensure that CodeSite is only enabled if we compile the project with the DEBUG
build configuration, and even in that case CodeSite will only be used if it’s actually installed
on the server machine where we run the DataSnap Server application.
The above code will produce a logfile in the My Documents\My CodeSite Files\Logs
directory with the name of the application as filename. Apart from that, the CodeSite Live
Viewer will also be shown. If that’s not needed, then the line
Destination.Viewer.Active := True; // also show Live Viewer
can simply be removed. For the DataSnap Server Service project, the corresponding
startup code looks as follows:
{$R *.RES}
{$IFDEF DEBUG}
var
Destination: TCodeSiteDestination;
{$ENDIF}
begin
if not Application.DelayInitialize or Application.Installing then
Application.Initialize;
{$IFDEF DEBUG}
CodeSite.Enabled := True;
CodeSite.Enabled := CodeSite.Installed;
if CodeSite.Enabled then
begin
CodeSiteManager.ConnectUsingTcp; // for DataSnap services...
Destination := TCodeSiteDestination.Create(Application);
Destination.LogFile.Active := True;
Destination.LogFile.FileName := 'DataSnapServerService.csl'; // hardcoded name
Destination.LogFile.FilePath := '$(MyDocs)\My CodeSite Files\Logs\';
Destination.Viewer.Active := True; // also show Live Viewer
CodeSite.Destination := Destination;
CodeSite.Clear;
CodeSite.Send('DataSnap Server Service loaded')
end;
{$ELSE}
CodeSite.Enabled := False;
{$ENDIF}
Application.CreateForm(TServerContainer1, ServerContainer1);
Application.Run;
end.
You can make the FileViewer and LiveViewer logging optional (for example by reading
settings from an .ini file). In order to allow feedback, and test if the Windows Service is
able to talk to the CodeSite Viewers, we can add CodeSite logging messages to the Service
operation procedures, for example as follows (in the ServerContainer unit):
function TMyServerContainer.DoContinue: Boolean;
begin
Result := inherited;
CodeSite.Send('Service Continue');
DSServer1.Start;
end;
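The other service control methods can get the same treatment, for example (stopping the DSServer on pause is a design choice here, not a requirement):

```delphi
function TMyServerContainer.DoPause: Boolean;
begin
  CodeSite.Send('Service Pause');
  DSServer1.Stop;
  Result := inherited;
end;

function TMyServerContainer.DoStop: Boolean;
begin
  CodeSite.Send('Service Stop');
  DSServer1.Stop;
  Result := inherited;
end;
```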
Finally, for the ISAPI DLL, we must add different code to allow the ISAPI DLL to send
messages to CodeSite. Writing to a logfile may fail, because the ISAPI DLL may not have
enough permission.
As a consequence, I typically allow only writing to the LiveViewer using the TCP/IP
protocol.
begin
CoInitFlags := COINIT_MULTITHREADED;
Application.Initialize;
{$IFDEF DEBUG}
CodeSiteManager.ConnectUsingTcp; // for local services (like IIS)...
CodeSite.Enabled := CodeSite.Installed;
CodeSite.Clear;
CodeSite.Send('DataSnapServerRESTISAPI loaded');
{$ELSE}
CodeSite.Enabled := False;
{$ENDIF}
Application.WebModuleClass := WebModuleClass;
TISAPIApplication(Application).OnTerminate := TerminateThreads;
Application.Run;
end.
For both the Windows Service and the ISAPI DLL, the CodeSite Dispatcher must listen to
incoming TCP messages. We can configure this in the Ports tab of the Dispatcher Settings
dialog.
Make sure to tell the firewall to allow messages through these ports as well, otherwise you
still won’t see any CodeSite messages (even when sent from the same machine).
Once compiled, we can install the DataSnapServerService project from the command-line
as follows:
DataSnapServerService.exe /install
This may show a dialog from the firewall, and also a message that the service was
installed successfully:
Note that the service, while set to Automatic, may not start right away. In order to start
the service for the first time we can use the Computer Management Console and start it.
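Alternatively, the service can be started, stopped and removed again from the command-line (the service name DataSnapServerService is an assumption based on the project name):

```
net start DataSnapServerService
net stop DataSnapServerService
DataSnapServerService.exe /uninstall
```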
Note that because the DataSnap Service is installed with the Automatic Startup Type, it
will start automatically when you reboot the machine.
Also note that while the DataSnap Service is running, you will not be able to recompile the
project (or deploy the compiled project to the location where DataSnapServerService.exe
is running), since the executable will be locked. You need to manually stop, recompile
and/or redeploy, and then restart the Service in order to replace the application. Client
connections will be reset, and all state will be lost of course when you replace the
DataSnap Service application.
Finally, be aware that the generated Delphi DataSnap Service project will produce Release
and Debug builds in different locations (the Release\Win32 and Debug\Win32 directories).
If you’ve installed the service, it will keep running from the location it was installed from,
so switching Build Configurations will not cause the service itself to switch as well, unless
you make sure the actual Service executable at that location is replaced.
ISAPI Deployment
We can also deploy the DataSnap REST ISAPI DLL on Microsoft’s Internet Information
Services on Windows Server 2003 or 2008. This is also explained in some detail in an
article by Jim Tierney at http://blogs.embarcadero.com/jimtierney/2009/08/20/31502,
and a more recent article by John Kaster at http://edn.embarcadero.com/article/40873.
We know in advance that we will only need to deploy ISAPI extensions – and not CGI
executables – so you may decide to allow only the required extension here: All Unknown
ISAPI Extensions.
Once the ISAPI Extensions are enabled, we can create a virtual directory where the ISAPI
dynamic link library can be placed for deployment.
Virtual Directory
On Windows 2003, we need to deploy the ISAPI application in a virtual directory that has
the execute rights enabled. In the previous screenshot, open the Web Sites node and
select the web site for which you want to create the virtual directory. Then, right-click on
the web site node, and do New | Virtual Directory.
This will produce a Virtual Directory Creation Wizard that takes you through the steps to
create and configure a virtual directory.
You first have to specify the alias for the virtual directory. Specify Deployment for the
example from this courseware manual (feel free to use your own name for the virtual
directory, of course, but the dialogs will use Deployment as name).
The next step is the location for the virtual directory, for which I usually pick a directory
like c:\inetpub\wwwroot\Deployment, or a subdirectory in the root of the specific web site
I’m creating this virtual directory for. Note that the directory must exist, otherwise you will
get an error message telling you it doesn’t exist.
The last step in the Virtual Directory Creation Wizard dialog involves setting the
permissions for the virtual directory. For ASP.NET web services, you only need to enable
Read and Run scripts (such as ASP) and no Execute, Write or Browse rights. But for CGI
executables and/or ISAPI DLLs, you need to check the Execute rights option.
When the Execute option has been set, you can continue and finish the Virtual Directory
Creation Wizard.
Once the virtual directory is created, you can right-click on the virtual directory and select
properties to continue with the configuration, allowing you to set the default content pages
(by default set to index.htm, index.html and Default.aspx for example), but also the
Directory Security settings, and the virtual directory options themselves. The virtual
directory will contain Application Settings, in our case with Application name Deployment,
using the default application pool. It’s a good idea to use a special application pool for
certain virtual directories and applications, so they do not get in the way of other
applications.
You can use the Application Pools node (just above the Web Sites node) to configure
existing application pools and/or add new ones for your virtual directories and applications.
You can now place your ISAPI Extension project in the new virtual directory. For a
DataSnapServerRESTISAPI.dll placed in the Deployment directory, the URL to call this will
be http://localhost/Deployment/DataSnapServerRESTISAPI.dll where you can replace
localhost by the IP-address or – even better – the DNS name of the machine.
Assuming this was my own web server, and I just created the virtual directory
DataSnapXE2 on the www.bobswart.nl site (which can use HTTP as well as HTTPS), then
the result would be http://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll.
Note that if you do not have your own web server machine, there is a good chance you
need help from your ISP in order to create the virtual directory.
SSL Certificates
In order to secure the communication between the DataSnap web service and the clients,
it is highly recommended to install an SSL Certificate for the web server domain, so the web
service requests are done using https:// instead of the insecure http://. Using https://
and SSL, the requests and responses between a client and server are encrypted.
This means that the previous example http:// location for the DataSnap ISAPI REST
Server at http://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll is replaced
by the secure https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll
I always use the Dutch company Xolphin to purchase my SSL Certificates, see
www.xolphin.nl for more details.
In order to request an SSL Certificate, you need a Certificate Signing Request (CSR), which
can be obtained by using the Internet Services Manager and selecting the properties menu
for the website you want the certificate to be used for.
In here, we need to click on the Server Certificate button, which brings us to the next
screen.
If we click on Next on that screen, we can then select what kind of certificate we want to
request:
I assume you want a new certificate, so click on Next. Later, we’ll get back to this dialog in
order to import a certificate.
Select the “Prepare the request now, but send it later” option – we will indeed send it as
soon as this dialog is done.
On the next page, you can specify the name for the web site that you want the certificate
to be valid for. Note that this is not yet the DNS name, but it should be something that’s
good to remember.
On the next page, you can specify the name of your organisation and organisation unit,
and on the next page you can define the actual DNS name for the certificate.
This should be a real DNS name, like www.bobswart.net (and in real life, I have an SSL
certificate for the www.bobswart.nl domain).
After this, you can specify the geographical information, and finally you’ll get an overview
page, followed by a page where you can specify the filename for your request and the
request summary.
Your SSL Certificate provider needs this request file, so you can quickly receive the SSL
Certificate to install at your server (for the specified DNS web site).
Once the certificate is installed, you can make sure the Web Service is connected using the
secure https:// protocol instead of the insecure http:// way. This will improve security, but
also give your clients or customers a good impression. Highly recommended!
Note that there is no support for HTTPS and SSL when building a stand-alone (WebBroker
Indy based) DataSnap Server application at this time.
This will ensure that only an HTTPS connection is accepted, unless the user Bob logs in.
From now on, the DataSnap server will only authenticate requests that use HTTPS and
SSL, and no “normal” and unencrypted HTTP requests.
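As a sketch of how this can be implemented (assuming a TDSAuthenticationManager component called DSAuthenticationManager1 on a server container class named TServerContainerREST – your class and component names may differ), the OnUserAuthenticate event handler can examine the Protocol parameter:

procedure TServerContainerREST.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  // Only accept encrypted https requests, unless the user Bob logs in
  valid := SameText(Protocol, 'https') or (User = 'Bob');
end;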
Note that you only need to select the ISAPI Extensions, but the CGI extensions may be
nice for little tests (although DataSnap really requires ISAPI extensions).
In the above screenshot, both CGI and ISAPI Extensions have been selected. But in
practice, only ISAPI extensions will work with DataSnap, especially when returning
DataSets from the DataSnap Server. DataSnap CGI Servers are no longer possible with
Delphi XE.
Once IIS7 is installed correctly, and you can view http://localhost in a browser, it's time
to configure IIS and add the virtual directory.
From the Administrative Tools in the Windows’ Start menu, select the Internet Information
Services (IIS) Manager. Alternatively, right-click on the My Computer icon and select
Manage. Then, inside the Server Manager dialog, open the Roles node and select the Web
Server (IIS) node. The subnode Internet Information Services (IIS) Manager will show you
the IIS Manager as well.
In the window that follows, select your machine (WS2008 in my case), and then in the
middle of the screen select the "ISAPI and CGI Restrictions" icon (see screenshot on the
next page).
Double-click on the "ISAPI and CGI Restrictions" icon. In the overview that follows, click
on the "Edit Feature Settings..." action item on the right side. This opens the Edit ISAPI
and CGI Restrictions Settings dialog, where you should check the Allow unspecified CGI
modules and/or Allow unspecified ISAPI modules options.
Once CGI and/or ISAPI Settings are possible, we can move down the tree on the left, and
select the web site where we want to deploy the application.
Right-click (in this case on the Default Web Site) and select "Add Virtual Directory".
In the dialog that follows, specify the alias (Deployment in this case) as well as the
physical path.
In our case, the physical path is c:\inetpub\wwwroot\Deployment. Make sure that the
directory exists (otherwise you will get an error message).
When the virtual directory is created, make sure to select it in the left side of the IIS
Manager window, and then in the middle pane select the Handler Mappings.
Click on the "Edit Permissions..." action item (or double-click on Handler Mappings), which
changes the screen to the Handler Mappings contents, which shows the CGI.exe and
ISAPI.dll features disabled by default.
Click on the "Edit Feature Permissions..." action item, and in the Edit Feature Permissions
dialog that follows check the Execute right item.
This completes the steps to enable CGI and/or ISAPI Extensions on IIS7. We can now
deploy DataSnap ISAPI extensions in the Deployment virtual directory on the default web
site.
Note that apart from a virtual directory – which is enough for CGI or ISAPI Extensions –
you can also create an application (using the “Add Application” context menu on the web
site). This gives you a similar dialog, but with a few more options such as an Application
pool that can be used to manage the CGI and ISAPI extensions in the application
directory.
This is mainly helpful when working on ASP.NET projects or the 64-bit version of IIS.
Click on the “Add Application Pool…” button to add a new application pool, specifically for
our 32-bit DataSnap ISAPI Servers. Make sure to give it a recognisable name (like
“DataSnap Pool”), leave the .NET Framework version and Managed pipeline mode set to
their default value (the DataSnap Server ISAPI DLLs will be unmanaged anyway), and click
on OK:
Now, right-click on the new DataSnap Pool entry in the list of application pools, and select
Advanced Settings.
This will give you a dialog where we can configure the new application pool in detail. The
option that we need to change here is “Enable 32-bit Applications”. It is set to False by
default, but we must set it to True to allow 32-bit DataSnap ISAPI DLLs to run.
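As a side note, the same setting can also be changed from the command line using the appcmd tool (a sketch, assuming the default IIS7 location of appcmd.exe and our “DataSnap Pool” name):

%windir%\system32\inetsrv\appcmd set apppool "DataSnap Pool" /enable32BitAppOnWin64:true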
Now, make sure the virtual directory and/or application uses this application pool.
Obviously, we should first click on the Select button to change the DefaultAppPool into the
DataSnap Pool here.
Then, we can deploy the compiled DataSnap REST Server ISAPI DLL to the
c:\inetpub\wwwroot\Deployment directory and connect to the DataSnap Server as
http://localhost/Deployment/DataSnapServer.DLL for example.
One way to deploy DataSnap Server applications is to set the output directory to the
location of the virtual directory (if you have access to it from the development machine).
In my case, the disk from the web server is shared as disk Z: in my local network (using
the network card behind the firewall), and the deployment directory for my DataSnap
REST ISAPI Server is Z:\Inetpub\wwwroot\Deployment as can be seen in the following
screenshot:
You may encounter an error message that Delphi cannot write to the destination
DataSnapServerRESTISAPI.dll, which can happen if the ISAPI DLL is still loaded by the
web server. Fortunately, using Application Pools, we can recycle the Application Pool (in
this case the DataSnap Pool) to ensure that the ISAPI DLL is unloaded again and can be
replaced after a new compile.
If you host more than one DataSnap ISAPI DLL, then you should make sure to use a
number of different Application Pools, for example one for the testing DataSnap servers
and one for the actual deployed DataSnap servers – or even one Application Pool for each
deployed DataSnap Server.
This will produce a dialog where we can select the new platform that we want to add. In
the case of DataSnap Server projects, the new choice is only 64-bit Windows:
The new choice will be turned into the active platform, and we can now compile Debug and
Release build configuration versions of the ISAPI DLL and deploy it with the 64-bit version
of IIS. Note that this time, we do not need a special Application Pool with the “Enable 32-
bit Applications” set to True (a special Application Pool is always a good idea, but this
property does not have to be set, since the new ISAPI DLL is 64-bit itself).
In the section on DataSnap and Databases, I will also explain details about the 32-bit and
64-bit versions of the MIDAS.DLL and the MidasLib unit (the latter will avoid all confusion
regarding the MIDAS.DLL).
Directory of \Inetpub\wwwroot\DataSnapXE2\css
Directory of \Inetpub\wwwroot\DataSnapXE2\images
Directory of \Inetpub\wwwroot\DataSnapXE2\js
Directory of \Inetpub\wwwroot\DataSnapXE2\templates
Note that the DataSnapServerRESTISAPI is in the root directory itself. Also note that by
default, we should not give write permission to any of these directories, including the js
directory. This means that the DataSnapServerRESTISAPI.dll will not be able to update
the ServerFunctions.js file. However, there is an easy workaround, since the
ServerFunctions.js file only needs to be updated when the DataSnapServerRESTISAPI.dll
file itself has changed. In that case, we can use the fact that we have two REST server
projects in the project group: the stand-alone DataSnapServerRESTVCL.exe as well. We can
run this stand-alone WebBroker server and allow it to regenerate the new ServerFunctions.js
file, and then deploy both the new DataSnapServerRESTISAPI.dll and the ServerFunctions.js
file to the server. That way, we do not need to open up any write access on the web server.
This also means you should modify the WebFileDispatcher1BeforeDispatch function in the
WebModule unit, to check if the application is an ISAPI DLL with the IsLibrary call.
procedure TWebModuleREST.WebFileDispatcher1BeforeDispatch(Sender: TObject;
  const AFileName: string; Request: TWebRequest; Response: TWebResponse;
  var Handled: Boolean);
var
  D1, D2: TDateTime;
begin
  Handled := False;
  if IsLibrary then Exit; // ISAPI DLL: skip regenerating ServerFunctions.js
  // ... remainder of the generated code, which regenerates the
  // ServerFunctions.js file when the server binary is newer ...
end;
Note that there is also a proxy subdirectory, which currently (for Delphi XE2 Update #4)
contains no fewer than six subdirectories: csharp_silverlight, freepascal_ios42,
freepascal_ios50, java_android, java_blackberry, and objectivec_ios42. The two Free
Pascal directories are new, and can be used to create iOS DataSnap clients (which will also
be covered in more detail in my Delphi XE2 iOS Development courseware manual).
Clicking on “Open Browser” will start the server and open the browser. Here, we’ll see the
login page as well as a link to the Server Functions.
However, this "Server Functions" link is only visible if we connect from the localhost or
127.0.0.1.
So if you run the real deployed DataSnapServerRESTISAPI.dll from the web server (and
not from localhost or 127.0.0.1) at for example my server with the URL
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll then the Server
Functions link will not show up, and we will only be able to login and test the Reverse
String function.
In order to enable testing from a remote client machine as well, we need to modify the
AllowServerFunctionInvoker function in the Web Module unit, returning True for the IP
address of the server. For my fixed IP-address 88.159.206.154 (for www.bobswart.nl) this
is as follows:
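Based on the default generated implementation, the modified AllowServerFunctionInvoker function could look as follows (a sketch; the last test adds the fixed IP-address of the server itself):

function TWebModuleREST.AllowServerFunctionInvoker: Boolean;
begin
  Result := (Request.RemoteAddr = '127.0.0.1') or
    (Request.RemoteAddr = '0:0:0:0:0:0:0:1') or
    (Request.RemoteAddr = '88.159.206.154'); // fixed IP-address of the server
end;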
As a consequence, you can also see the "Server Functions" link if you connect to the
DataSnap REST server on my web server. You will need to make a similar modification to
the AllowServerFunctionInvoker if you want to be able to test the server functions from a
browser on a different machine as well. Note that this will allow any client using a browser
to test the server methods (like on my server), which may not be what you want to allow.
While testing the server methods, note that ReverseString will return “undefined” if we do
not have enough authorization to execute this method (in this case, if we didn’t login as
user “Bob” – see the previous section for details).
Finally, note that any attempt to login using the browser with http:// instead of https://
will now fail with an error message (because the Authentication is expected to fail for
http:// requests on port 80). Unfortunately, when testing server methods via the browser,
the Protocol will be "datasnap" instead of the actual communication protocol http or https.
This is another reason why you should be careful not to open up all server methods to
anyone (via the browser) using the modified AllowServerFunctionInvoker function.
The login is for an Amazon EC2 account, which is not free, so it’s not easy to “play along”
unless you are serious about deployment on the cloud.
When you click on the Get Machines button, you must select the correct region for your
login:
The workings of the Deploy to the Cloud dialog are actually quite easy to understand once
you know what's happening behind the scenes.
The EC2 cloud machine is just a server, and you need to define a directory on that
machine where your files will end up being deployed to. So, if I want to deploy to a
directory called "Bob", I need to make a directory on the server, and make it shareable
under the name "Bob", and need to make sure Administrator can write to it.
When you login to the EC2 machine, that login will connect to the Windows Shared
directory (not very secure, I know) and upload the files. This may mean that you need to
disable the firewall for a while (in order to allow the incoming connection). I've done some
testing, and managed to use the "Deploy to the Cloud" dialog to actually deploy some
DataSnap servers.
However, there is a bit of an issue when trying to update DataSnap servers in the cloud.
When the target file at the server side is "locked" (i.e. when the file is running at the
server for example - a DataSnap Server in action), then the deployment will fail with an
error 32 "The process cannot access the file because it is being used by another process".
Reported as QC #88550 - Deploy to the Cloud - cannot upload if target file is running
(active)
This is a weak spot of the cloud deployment feature: you can deploy DataSnap servers,
but as soon (and as long) as they are in use, you cannot deploy a new version until the old
server has been stopped and the process terminated. Which is why I will probably stick to
manual deployment for my own servers.
Nevertheless, provided the EC2 cloud server has been configured correctly, the Deploy to
the Cloud dialog can certainly be a time saver for developers who want to benefit from
Amazon EC2 to deploy their DataSnap Server applications, shielding them from the low-
level deployment details.
The fact that the DataSnap Servers need to be stopped before you can replace them,
prevents me from mentioning that it's the fastest way for (re)deployment of DataSnap
servers as well.
When Delphi is installed, three 32-bit versions of the OpenSSL DLLs can be found on your
machine (all three of them actually slightly different):
- C:\Program Files\CollabNet
- C:\Program Files\Embarcadero\RAD Studio\9.0\bin\SubVersion
- C:\Program Files\FinalBuilder 7 XE2
Even on a 64-bit Windows machine, I could find no 64-bit version of OpenSSL, but you can
download a build from http://www.slproweb.com/products/Win32OpenSSL.html.
In the archive, you’ll find libeay32.dll and ssleay32.dll. Note that these are 64-bit versions
of the OpenSSL DLLs, having the same names as their 32-bit counterparts. Make sure they
do not end up in your 32-bit Windows directory. In fact, you may want to make sure that
they are in the directory of your 64-bit DataSnap Server and/or 64-bit DataSnap Client
only.
5. DataSnap Clients
When the first DataSnap Server demo is up-and-running and listening to incoming
requests, it’s time to build a client. In this section, I’ll explain how we can connect from
the client to the server, and how to import the server methods by generating client proxy
classes.
The components in the DataSnap Client category are the “old” DataSnap components,
which can still be used, but are no longer the recommended way of using DataSnap. There
is one exception: the new TDSProviderConnection component, which can be used to
connect an “old” DataSnap server to a new DataSnap client.
Instead of the DataSnap Client category, we must look at the dbExpress category, where a
new component called TSqlServerMethod can be found (note from the next screenshot
that this new component can be identified easily, since it’s the one with TSql in its prefix
instead of TSQL that the existing dbExpress components still have).
The TSqlServerMethod component can be used to call remote methods from a DataSnap
server, but it needs to connect to the DataSnap server first.
The connecting can be done using a TSQLConnection component – and not using one of
the older TxxxConnection components. In order to facilitate this, the TSQLConnection
component has a new driver name in the Driver drop-down list, called DataSnap.
So, place a TSQLConnection component on the ClientForm, and set its Driver property to
DataSnap. As a result, the Driver property will show a plus sign (on the left of the property
name), and we can expand the property to show all subproperties.
The TSQLConnection settings on the previous page can be used to connect to the stand-
alone DataSnap VCL server, running on the same machine (or on a different machine, if
you also change the HostName property).
We can also connect to the up-and-running DataSnapServerRESTISAPI application
deployed on my web server on the www.bobswart.nl domain using HTTPS. In order to
connect to this DataSnap server, we need to set the CommunicationProtocol property to
https, the Port to 443, the HostName property to www.bobswart.nl, the DSAuthUser
property to “Bob” (no password needed at this time), and finally the URLPath to
/DataSnapXE2/DataSnapServerRESTISAPI.dll
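As an alternative to using the Object Inspector, the same settings can be applied in code (a sketch; the parameter names are assumed to match the subproperty names as they appear in the Object Inspector, so verify them against your version of the DataSnap driver):

SQLConnection1.DriverName := 'DataSnap';
SQLConnection1.Params.Values['CommunicationProtocol'] := 'https';
SQLConnection1.Params.Values['HostName'] := 'www.bobswart.nl';
SQLConnection1.Params.Values['Port'] := '443';
SQLConnection1.Params.Values['DSAuthenticationUser'] := 'Bob';
SQLConnection1.Params.Values['URLPath'] := '/DataSnapXE2/DataSnapServerRESTISAPI.dll';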
When you’ve entered these settings correctly, you can test the connection by changing the
Connected property from False to True to see if you can connect to the DataSnap Server
on my web server:
Once you’ve verified that the connection can be made, we can right-click on the
TSQLConnection component and select the Generate DataSnap client classes option.
This will produce a new unit, by default called Unit1, with a class called
TMyServerMethodsClient (the actual name of the DataSnap Server methods class at the
server side with the “Client” part added to it). Save this unit in ServerMethodsClient.pas.
As you can see, there are two constructors for the TMyServerMethodsClient class, one
destructor, and the three server methods we defined at the DataSnap server side.
In order to use these methods, add the ServerMethodsClient unit to the uses clause of the
ClientForm, place a TButton component on the client form, and write the following code in
the OnClick event handler:
procedure TForm1.Button1Click(Sender: TObject);
var
Server: TMyServerMethodsClient;
begin
SQLConnection1.Connected := True; // just in case...
Server := TMyServerMethodsClient.Create(SQLConnection1.DBXConnection);
try
ShowMessage(DateTimeToStr(Server.ServerTime))
finally
Server.Free
end;
end;
This code will create the instance of the TMyServerMethodsClient, then call the ServerTime
server method, and finally destroy the proxy to the DataSnap server again.
Note that we explicitly assign True to the Connected property of the TSQLConnection
component. This is done to ensure that the connection is opened if not already open. In
fact, it is highly recommended to set the Connected property of the SQLConnection
component to False at design-time. This will ensure that the TSQLConnection component
does not try to make a connection to the DataSnap server as soon as you open the
project in Delphi. With the Connected property set to True, an attempt to make such a
connection to the DataSnap Server will be done (and if the server is up-and-running this
may be no problem, and only cause for a slight delay, but if the server is not running, then
Delphi may be “frozen” for a little while, until you get an error message that a connection
could not be made).
If you compile and run the DataSnapClient project, we can click on the button to retrieve
the server time. This, however, will still give an error message. And if you forgot to set the
Connected property of the TSQLConnection component to False, you would have seen this
error message as soon as you started the DataSnapClient application.
The error message explains that the https protocol cannot be used right now, but only
after a TDBXCommunicationLayer class has been registered to handle the https protocol.
You will not get this error message if you used the default tcp/ip communication protocol,
by the way.
For the https (and http) protocol, we need to add a unit to the uses clause – somewhere in
the DataSnapClient project, to ensure the correct TDBXCommunicationLayer class is
registered. The unit we need is DataSnap.DSHTTPLayer, which contains the
TDBXHTTPLayer and TDBXHTTPSLayer classes to handle the HTTP and HTTPS
communication protocols.
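In practice, this is just one extra unit name in the uses clause of (for example) the ClientForm unit; the existing entries are elided here:

uses
  ..., DataSnap.DSHTTPLayer; // registers TDBXHTTPLayer and TDBXHTTPSLayer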
With the DataSnap.DSHTTPLayer unit added to the uses clause, we can compile and run
the DataSnapClient project again. Clicking on the button will produce the messagebox with
the value of the current server time, as expected.
I leave it as an exercise for the reader to test the EchoString and ReverseString methods
in a similar way.
Tracing SQLConnection
It would be interesting to see what’s going on behind the scenes between the
TSQLConnection component at the client side and the TDSServer class TMyServerMethods
at the server side. We can trace the messages that the TSQLConnection at the client side
is sending and receiving by using a TSQLMonitor component. If we place one on the Client
Form, we need to connect its SQLConnection property to the TSQLConnection component,
and then we can implement the OnTrace event handler of the TSQLMonitor component.
Note that when you first create the OnTrace event handler, the Delphi IDE will complain
about the TDBXTraceInfo type being unknown. We must add the unit Data.DBXCommon to
the uses clause (QC #66883).
procedure TForm1.SQLMonitor1Trace(Sender: TObject; TraceInfo: TDBXTraceInfo;
var LogTrace: Boolean);
begin
end;
Once the unit is added, we can use the TDBXTraceInfo type. This is a packed record with
Message, TraceFlag, TraceLevel and CustomCategory fields. Message and CustomCategory
are strings, TraceFlag and TraceLevel are integers (or TInt32 actually).
DBXCommon.pas also defines two helpful classes to show the possible values that
TraceFlag and TraceLevel can take:
TDBXTraceFlags = class
const
None = $0000; // Trace nothing
Prepare = $0001; // prepared query statements
Execute = $0002; // executed query statements
Error = $0004; // errors
Command = $0008; // command related operations
Connect = $0010; // connect and disconnect
Transact = $0020; // transaction commit, rollback
Blob = $0040; // blob access
Misc = $0080; // Miscellaneous
Vendor = $0100; // Vendor specific operations
Parameter = $0200; // TDBXParameter access
Reader = $0400; // TDBXReader operations
DriverLoad = $0800; // loading operations
MetaData = $1000; // Meta data access operations
Driver = $2000; // Driver operations
Custom = $4000; // Allow the trace handler to filter the TDBXTraceEvent
end;
TDBXTraceLevels = class
const
None = 0; // Trace nothing
Debug = 1; // Debug trace
Info = 2; // Information tracing
Warn = 3; // Trace warnings
Error = 4; // Trace errors
Fatal = 5; // Trace fatal errors
Highest = Fatal; // Trace for all levels
end;
With this in mind, we can implement the OnTrace event handler as follows (adding the
CodeSiteLogging unit to the uses clause):
procedure TDSClientForm.SQLMonitor1Trace(Sender: TObject; TraceInfo: TDBXTraceInfo;
var LogTrace: Boolean);
begin
CodeSite.SendMsg(TraceInfo.Message);
LogTrace := True;
end;
The LogTrace variable is used to specify that we’ve logged the event, so there’s no need
for the TSQLMonitor to continue logging it.
The Active property of the TSQLMonitor controls whether or not the tracing is active. This
can be set in the OnCreate event handler of the Form for example:
procedure TForm1.FormCreate(Sender: TObject);
begin
SQLMonitor1.Active := True;
end;
At a minimum, we need to set the Host (for example to localhost or a DNS name or IP
address) and the Port property. The Protocol is a choice between HTTP or HTTPS (there is
no support for TCP/IP in REST of course). Optionally, we can set the UserName and
Password properties (if Authentication is required) and the UrlPath if the DataSnap Server
is an ISAPI DLL deployed using IIS.
For the DataSnapServerRESTISAPI.dll on https://www.bobswart.nl this means we can
configure the TDSRestConnection component as follows:
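For reference, here is a sketch of the same configuration applied in code, using the TDSRestConnection property names mentioned in the paragraph above:

DSRestConnection1.Protocol := 'https';
DSRestConnection1.Host := 'www.bobswart.nl';
DSRestConnection1.Port := 443;
DSRestConnection1.UserName := 'Bob'; // no password needed at this time
DSRestConnection1.UrlPath := '/DataSnapXE2/DataSnapServerRESTISAPI.dll';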
Once the required properties are set, we can right-click on the TDSRestConnection
component and select “Test Connection” to see if we can make a connection to the
DataSnap REST Server.
Note that there is no property to actually make the connection “active”, although the
“Keep-Alive” value will make sure the connection is kept alive (warning: if you click on that
property, the value will be cleared).
After that, we can right-click on the TDSRestConnection component and select “Generate
DataSnap Client Classes”, which will produce the REST client classes in a unit called
ClientClassesUnit1, as follows:
type
TMyServerMethodsClient = class(TDSAdminRestClient)
private
FEchoStringCommand: TDSRestCommand;
FReverseStringCommand: TDSRestCommand;
FServerTimeCommand: TDSRestCommand;
public
constructor Create(ARestConnection: TDSRestConnection); overload;
constructor Create(ARestConnection: TDSRestConnection; AInstanceOwner: Boolean); overload;
destructor Destroy; override;
function EchoString(Value: string; const ARequestFilter: string = ''): string;
function ReverseString(Value: string; const ARequestFilter: string = ''): string;
function ServerTime(const ARequestFilter: string = ''): TDateTime;
end;
We can use this TMyServerMethodsClient class as follows, inside an OnClick event handler
for example:
procedure TFormClient.Button1Click(Sender: TObject);
var
MyServerMethodsClient: ClientClassesUnit1.TMyServerMethodsClient;
begin
MyServerMethodsClient := ClientClassesUnit1.TMyServerMethodsClient.Create(DSRestConnection1);
try
ShowMessage(MyServerMethodsClient.ReverseString('abc'));
finally
MyServerMethodsClient.Free
end;
end;
Note that I now need to prefix the TMyServerMethodsClient with the name of the unit
(ClientClassesUnit1 in this REST case), since both the DBX Client Classes and the REST
Client Classes proxy generators created a class called TMyServerMethodsClient.
The first button OnClick implementation (for the DBX Connection) needed to be modified
as follows:
procedure TForm1.btnServerTimeClick(Sender: TObject);
var
Server: ServerMethodsClient.TMyServerMethodsClient;
begin
SQLConnection1.Connected := True; // just in case...
Server := ServerMethodsClient.TMyServerMethodsClient.Create(SQLConnection1.DBXConnection);
try
CodeSite.SendMsg(DateTimeToStr(Server.ServerTime));
ShowMessage(DateTimeToStr(Server.ServerTime))
finally
Server.Free
end;
end;
This illustrates the main difference between a REST and DBX connection. REST can only
use HTTP and HTTPS, while DBX can use TCP/IP as well. REST does not have to be
explicitly “connected”, while a DBX connection using TSQLConnection needs to be
connected in order to create the TMyServerMethodsClient.
Finally, we cannot apply DataSnap Filters on REST connections, but we can apply them on
DBX connections. As a consequence, the only security layer we can add for the
communication between DataSnap REST servers and clients is the use of HTTPS and SSL.
HTTP Tunneling
HTTP tunneling is supported for clients outside a firewall, with the OnOpenSession,
OnErrorOpenSession, OnCloseSession, OnWriteSession, OnErrorWriteSession,
OnReadSession, and OnErrorReadSession events that help with Failover, Replication and
Load-balancing.
This will produce a FireMonkey cross-platform project that can be compiled for Win32,
Win64 as well as Mac OS X.
Do File | Save Project As and save the main form in ClientForm.pas and the project itself
as DataSnapClientFMX.dpr. The ClientForm will show up in the FireMonkey designer, which
gives the form a black border, only placeholders for the border icons, and no VCL designer
guidelines (since this is the FireMonkey designer).
Set the Caption of the Form to "DataSnap FMX Client", and the Position property to
poScreenCenter (others do not seem to work at this time).
Do File | New - Other, go to the DataSnap Server category of the Object Repository, and
this time select the DataSnap Client Module.
This will produce a wizard to automatically generate the import units and modules for a
DataSnap Client.
In the DataSnap Client Module wizard, we have to make a number of choices, starting with
a choice between a local or a remote server. The local server choice is not useful – even if
the server is running on the local machine, you can always enter localhost or 127.0.0.1 as
IP-address, so select Remote Server as your choice here:
In the next page, you can specify which kind of DataSnap Server to connect to. Based on
the DataSnap Server that is up-and-running, make your choice here:
The third page can be used to select the communication protocol to connect to the
DataSnap Server: TCP/IP or HTTP (note that HTTPS is not an option in this dialog):
On the last page, we can specify the host name, port, user name and password to connect
to the DataSnap Server, and press the “Test Connection” button to verify that all fields are
entered correctly.
Finally, we can click on the Finish button, which will produce two new units for our
DataSnapClientFMX project: ClientModuleUnit1.pas and ClientClassesUnit1.pas.
The TClientModule1 class has two properties that are important to use: InstanceOwner (to
specify if the client is the owner of the parameter instances) and MyServerMethodsClient,
which is of type TMyServerMethodsClient. The property MyServerMethodsClient has a get
method which is interesting to examine in a bit more detail:
function TClientModule1.GetMyServerMethodsClient: TMyServerMethodsClient;
begin
if FMyServerMethodsClient = nil then
begin
SQLConnection1.Open;
FMyServerMethodsClient:= TMyServerMethodsClient.Create(SQLConnection1.DBXConnection,
FInstanceOwner);
end;
Result := FMyServerMethodsClient;
end;
The field FMyServerMethodsClient is the placeholder field in the class, and as you can see,
if it doesn’t exist yet, a new instance will be created; this same instance is then returned
for every subsequent call of the get method (or use of the property).
Here are the three methods that were defined at the server side: EchoString,
ReverseString and ServerTime.
The ClientForm can use both the TClientModule (with the MyServerMethodsClient property)
and the TMyServerMethodsClient (for the definition of the server functions available to
call), but we have to add both the ClientModuleUnit and the ClientClassesUnit unit to the
uses clause of the ClientForm.
So, open ClientForm and press Alt+F11 to add both the ClientClassesUnit1 and the
ClientModuleUnit1 unit:
With these units added to the uses clause, we can now add a TEdit and a TButton to the FireMonkey form. Set the Name property of the TButton to btnServerTime, and its Text property to “Server Time”.
Place another button on the form, call it btnReverseString, and set its Text property to
“Reverse Str”.
Now compile and run the DataSnap FireMonkey Client application (initially, as a Win32 client). After clicking the Server Time and Reverse String buttons, the output could look as follows:
And this looks and feels like a completely normal Win32 application (which it is, of course), connecting to the DataSnap Server.
Compiling and running as a 64-bit Windows target should give no problems either.
Deployment on Mac OS X
Apart from 32- and 64-bit Windows, Delphi XE2 also offers native Mac OS X as target
platform. The IDE remains a 32-bit application, however, so running (launching) a Mac OS
X application from the IDE will require some tinkering. First of all, you need a Mac running
Mac OS X version 10.6 (Snow Leopard) or later. In order to deploy from your Windows
development machine to the Mac target machine, you also need to install a separate utility
called the Platform Assistant (paserver) on the Mac. The paserver can be found on your
development machine in the C:\Program Files\Embarcadero\RAD Studio\9.0\PAServer
directory. There is a setup_paserver.zip archive for Mac OS X: copy it to the Mac, unzip it there, and run the setup_paserver application to launch the installer of the Platform Assistant Server application.
A number of screens need to be passed (leaving the default settings intact), after which
the Platform Assistant Server is installed on the machine, in my case in the
/Users/bob/Applications/Embarcadero/PAServer directory:
Now, as soon as you start the paserver application itself, it will start a terminal window
and ask for a password.
This is the password that you can define here (at the deployment machine) and that you
have to specify at the development machine to allow the development machine access to
the deployment machine.
Once the password is entered, we can leave the paserver running at the Mac, and return
to the Windows development machine to compile and run (with or without debugging)
and/or deploy the project as a Mac OS X application.
Mac OS X Development
The next step back at Delphi XE2 involves the addition of a new target platform. In the
Project Manager for our DataSnapClientFMX project, open the Target Platforms node to
verify that it only contains the default 32-bit Windows target. Then, right-click on the
Target Platforms node and select “Add Platform…” to add a new target platform. In the
dialog that follows, select OS X as new target:
A Target Platform also needs a Remote Profile with the information to connect to the
remote target machine. If a remote profile for the selected platform already exists, then
Delphi XE2 will assign it as remote profile. Otherwise, you will be prompted to create a
new remote profile as soon as you try to run the project for OS X.
If you click on Next, a dialog will show up with all available Remote Profiles for the OS X
Platform. Initially, this list will be empty, so you need to click on the “Add…” button to
create a new Remote Profile, or you need to click on the “Import…” button to import a
Remote Profile (in case you saved and exported one from another development machine
for example).
The Add button will display the “Create a Remote Profile” wizard, where we can specify the
name of the remote profile. I typically call the Mac profiles “Mac” followed by the last part
of the IP-address, so I know which Mac I’m talking about. Mac164 is the Mac mini running
OS X Snow Leopard that we just installed the paserver on:
Note the checkbox to use this Remote Profile as default remote profile for the OS X
platform, so next time you add a Target Platform for OS X, this Remote Profile will be
assigned to it automatically.
On the next page, we can specify the Host Name of the Mac OS X machine (or the IP-
address). The Port number is already specified using the default port number 64211. If
you debug and deploy in your own local network, that port number is fine. However, if you
plan to debug or deploy over the internet, I would change that port number to a slightly
less obvious one, since only the password is keeping other “visitors” from connecting to
your Mac and “deploying” applications to it.
Finally, do not forget to specify the same password here as the one you specified in the
paserver back on the Mac.
Click on the Test Connection button to ensure you can talk to the Mac. If the connection is
refused, then you may have to configure the firewall on the Mac to allow the incoming
connection.
If you try to connect to a remote profile for a Win32 or Win64 machine, you’ll get an error
message:
If you close this confirmation dialog and go to the next page in the Remote Profile Wizard,
a page is shown which is only relevant for C++Builder developers (in which case you need
to cache symbolic information from the remote machine on the local machine). Delphi
developers can just skip that page and click on the Finish button to create the remote
profile. This will bring you back to the Select Remote Profile dialog, where you can now
finally select the newly created Remote Profile.
As a result, the Project Manager will now show the Remote Profile next to the Target
Platform in the Project Manager:
Also, if the Remote Profile selection was shown because you tried to “Run” the application, the actual application will now be running on your Mac OS X machine! This
may not be clear at first, especially if you did Run | Run without Debugging, but if you
switch back to the Mac, you’ll see the FireMonkey DataSnap Client application running as a
native Mac OS X application!
And this certainly looks like a native Mac OS X application to me. If you compare this
screenshot to the Windows edition of the FireMonkey Demo, you will see the same ListBox,
Edit and Button, but the actual look-and-feel is now totally different and Mac-like, while
the Windows edition truly feels like a Windows application. Of course, these are just simple
demo applications, but you should get the idea…
Mac OS X Deployment
When it comes to deployment of your application, especially to the Mac running OS X, you
need to perform a number of steps again. Using Project | Deployment you can get a
Deployment tab in the IDE that shows the files that need to be deployed to the Mac OS X
machine. There is one gotcha: if you compile a Release Build, then the project .rsm file will
not end up in the OSX32\Release directory, so this file cannot be deployed. The Debug
Build will produce the .rsm file, which causes it to be displayed in the list of deployment
files:
If you connect to the deployment machine, you can see the Remote Status. The green arrow will actually deploy the project files. On my Mac, they end up in the /Users/bob/Applications/Embarcadero/PAServer/scratch-dir/Bob-Mac164 directory, where Bob is the name of the remote user (since the local user is called "bob") and Mac164 is the name of my Remote Profile.
The DataSnapClientFMX demo is a 17.6 MB archive that contains all selected deployment
files, and can be run as stand-alone application on the Mac. We can also copy it to another
Mac (running at least Mac OS X Snow Leopard) and run it from there. If you copy it to a
USB stick, you’ll see a DataSnapClientFMX.app directory with a Contents subdirectory and
a MacOS as well as a Resources directory inside plus the Info.plist file (see list of files
above). The MacOS directory contains DataSnapClientFMX, DataSnapClientFMX.rsm and
the libcgunwind.1.0.dylib library (only 20 KB), while the Resources directory contains the
DataSnapClientFMX.icns file.
Note that the combination of DataSnap and databases is supported by DBX only, using the TSQLConnection component at the client side. For the DataSnap client example we’ve built so far, we’ve used only the SQLConnection component and the generated client classes directly, but we can also use the two new DataSnap components TSqlServerMethod and TDSProviderConnection.
First, we need to ensure that the server actually exposes a TDataSet, so return to the
DataSnap server projects. Open up the ServerMethods unit and place a TSQLConnection
component on the TDSServerModule. Connect the TSQLConnection component to your
favorite DBMS and table (in my example, I will connect to the EMPLOYEE table from the
InterBase EMPLOYEE.GDB example database).
In order to do that, set the Driver property of the TSQLConnection component to
InterBase. Then, open up the Driver property and set the Database property to the path to
the EMPLOYEE.GDB database. Make sure the LoginPrompt property of the TSQLConnection
component is set to False, and then try to set the Connected property to True to see if you
can open up the connection to the EMPLOYEE.GDB database.
object SQLConnection1: TSQLConnection
DriverName = 'InterBase'
GetDriverFunc = 'getSQLDriverINTERBASE'
LibraryName = 'dbxint.dll'
LoginPrompt = False
Params.Strings = (
'DriverUnit=Data.DBXInterBase'
'DriverPackageLoader=TDBXDynalinkDriverLoader,DbxCommonDriver160.' +
'bpl'
'DriverAssemblyLoader=Borland.Data.TDBXDynalinkDriverLoader,Borla' +
'nd.Data.DbxCommonDriver,Version=16.0.0.0,Culture=neutral,PublicK' +
'eyToken=91d62ebb5b0d1b1b'
'MetaDataPackageLoader=TDBXInterbaseMetaDataCommandFactory,DbxInt' +
'erBaseDriver160.bpl'
'MetaDataAssemblyLoader=Borland.Data.TDBXInterbaseMetaDataCommand' +
'Factory,Borland.Data.DbxInterBaseDriver,Version=16.0.0.0,Culture' +
'=neutral,PublicKeyToken=91d62ebb5b0d1b1b'
'GetDriverFunc=getSQLDriverINTERBASE'
'LibraryName=dbxint.dll'
'LibraryNameOsx=libsqlib.dylib'
'VendorLib=GDS32.DLL'
'VendorLibWin64=ibclient64.dll'
'VendorLibOsx=libgds.dylib'
'Database=C:\CodeGear\EMPLOYEE.GDB'
'User_Name=sysdba'
'Password=masterkey'
'Role=RoleName'
'MaxBlobSize=-1'
'LocaleCode=0000'
'IsolationLevel=ReadCommitted'
'SQLDialect=3'
'CommitRetain=False'
'WaitOnLocks=True'
'TrimChar=False'
'BlobSize=-1'
'ErrorResourceFile='
'RoleName=RoleName'
'ServerCharSet='
'Trim Char=False')
VendorLib = 'GDS32.DLL'
end
Next, place a TSQLDataSet component on the server module and connect its SQLConnection property to the TSQLConnection component. Leave the CommandType set to ctQuery, and click on the ellipsis button of the CommandText property to enter an SQL query. For the first example, we should only select the EMP_NO, FIRST_NAME, LAST_NAME, HIRE_DATE, and JOB_COUNTRY fields from the EMPLOYEE table.
Set the Active property of the TSQLDataSet component to True, to verify that no typing
mistake has been made in the CommandText property.
Then, make sure the TSQLConnection component has its LoginPrompt and Connected
property set to False at design-time, and ensure that the Active property of the
TSQLDataSet is also set to False again.
type
  TMyServerMethods = class(TDSServerModule)
    SQLConnection1: TSQLConnection;
    SQLDataSet1: TSQLDataSet;
  private
    { Private declarations }
  public
    { Public declarations }
    function EchoString(Value: string): string;
    [TRoleAuth('admin')]
    function ReverseString(Value: string): string;
    function ServerTime: TDateTime;
  public
    function GetEmployees: TDataSet;
  end;
As you can see from the TMyServerMethods declaration, the new function is called GetEmployees and the implementation can be as follows:
Note that we should not close the TSQLDataSet here, since the dataset must remain open
for the client to be able to retrieve the records.
However, this may result in a situation where the TSQLDataSet was already opened (for example by a previous call), in which case the call to "Open" will not have any effect, including the desired effect of moving to the first record in the dataset. In order to fix that potential
issue, we should first check to see if the TSQLDataSet is already active (opened), and
move to the first record if it is. Otherwise, just open it (which will also position the cursor
at the first record of course). Leading to the following modified implementation of the
GetEmployees function:
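A sketch of that modified implementation, following the description above:

```delphi
function TMyServerMethods.GetEmployees: TDataSet;
begin
  if SQLDataSet1.Active then
    SQLDataSet1.First // already open: just reposition on the first record
  else
    SQLDataSet1.Open; // opening also positions on the first record
  Result := SQLDataSet1; // do not close: the client still needs the records
end;
```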
Recompile the server and make sure it’s running again. Note that if you are unable to
recompile it, then it’s most likely that the DataSnap server project is still running. This will
especially be the case with Windows Service targets (that need to be stopped) or ISAPI DLLs (where it’s best to recycle the application pool).
Depending on the database, we must also deploy a dbExpress driver to the DataSnap
server deployment machine. In our case, using InterBase, we may get the following error
message from a deployment machine that does not have Delphi XE2 installed on it:
In case of an ISAPI DLL, we should not copy the dbxint.dll to the same virtual directory as
the DataSnapServerRESTISAPI.dll, since the ISAPI DLL will not look in that directory (it’s
not the working directory). Instead, we should copy dbxint.dll to the \Windows\System32
directory. And it may also be necessary to ensure the database client driver – GDS32.dll
for InterBase - is also placed in that location.
There is another DLL that should be deployed as well, the MIDAS.DLL. However, it may be
easier to just add the MidasLib unit to the uses clause of the project, thereby making sure
we never run into trouble when encountering different versions of the MIDAS.DLL on the
deployment machine. The ISAPI DLL will grow about 250 KB in size, but with a size of several MBs already, this should be no big deal.
Note that the MidasLib is available for both 32-bit and 64-bit Windows applications, but not
for Mac OS X applications. This is not relevant for DataSnap Servers (which can only be
Windows applications at this time), but it will be relevant when creating cross-platform
DataSnap clients, as we'll see later.
TSqlServerMethod
Return to the DataSnap client application. The TSQLConnection component should not be
connected to the DataSnap Server, so the Connected property should be set to False. If
not, then make sure it’s set to False now, to reset the connection to the server (which has
been stopped and recompiled before restarted again). We can briefly set Connected to
True again, to refresh the information from the server, and can then regenerate the client
classes.
If you right-click on the TSQLConnection component and select “Generate Client Classes”,
then the unit ServerMethodsClient will automatically be overwritten. The generated
TMyServerMethodsClient class has been extended with the GetEmployees method:
type
  TMyServerMethodsClient = class(TDSAdminClient)
  private
    FEchoStringCommand: TDBXCommand;
    FReverseStringCommand: TDBXCommand;
    FServerTimeCommand: TDBXCommand;
    FAdd33Command: TDBXCommand;
    FGetEmployeesCommand: TDBXCommand;
  public
    constructor Create(ADBXConnection: TDBXConnection); overload;
    constructor Create(ADBXConnection: TDBXConnection;
      AInstanceOwner: Boolean); overload;
    destructor Destroy; override;
    function EchoString(Value: string): string;
    function ReverseString(Value: string): string;
    function ServerTime: TDateTime;
    function Add33(X1: Integer; X2: Integer; X3: Integer; X4: Integer; X5: Integer;
      X6: Integer; X7: Integer; X8: Integer; X9: Integer; X10: Integer; X11: Integer;
      X12: Integer; X13: Integer; X14: Integer; X15: Integer; X16: Integer;
      X17: Integer; X18: Integer; X19: Integer; X20: Integer; X21: Integer;
      X22: Integer; X23: Integer; X24: Integer; X25: Integer; X26: Integer;
      X27: Integer; X28: Integer; X29: Integer; X30: Integer; X31: Integer;
      X32: Integer; X33: Integer): Integer;
    function GetEmployees: TDataSet;
  end;
In order to use the GetEmployees method to retrieve the TDataSet, we can use a
TSqlServerMethod component from the dbExpress category of the Tool Palette. Place the
TSqlServerMethod on the client form, connect its SQLConnection property to
SQLConnection1.
Open up the ServerMethodName drop-down list to show all available methods we can call:
The list consists of a number of DSAdmin methods (which can be disabled by setting the
HideDSAdmin property of the TDSServer component to True), followed by 16 DSMetadata
methods (this used to be only 3), the Add33 method (to demonstrate the maximum
number of 32 possible parameters), 7 TMyServerMethods.AS_xxx methods (offering the
IAppServerFastCall interface) and finally the remaining TMyServerMethods EchoString,
GetEmployees, ReverseString, and ServerTime methods.
Select TMyServerMethods.GetEmployees from the list. Then, place a TDataSetProvider component on the client form, point its DataSet property to the TSqlServerMethod, and add a TClientDataSet component with its ProviderName property pointing to DataSetProvider1. Finally, place a TDataSource on the client form and connect its DataSet property to the TClientDataSet component; after that we can use a TDBGrid and TDBNavigator component, connect their DataSource property to the TDataSource component, and view the data in the client form.
In order to verify the connection at design-time, we can set the ClientDataSet’s Active
property to True. This will toggle the Active property of the SqlServerMethod (only for a
short while to retrieve the data, after which the Active property is set to False again),
which will set the Connected property of the TSQLConnection component to True to enable
the connection between the DataSnap client and the server.
The connection remains active, by the way, and the result is both the TClientDataSet and
the TSQLConnection being active, showing the data inside the TDBGrid:
In order to view the data only at runtime, we need to make sure the TClientDataSet has its Active property set to False, and the TSQLConnection has its Connected property set to False again. Then, in the FormCreate, we can set Active to True or call the Open method of the TClientDataSet, to view the data (only) at runtime.
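A minimal sketch of such a FormCreate event handler:

```delphi
procedure TForm1.FormCreate(Sender: TObject);
begin
  // Opening the TClientDataSet at runtime implicitly activates the
  // TSqlServerMethod and the TSQLConnection to retrieve the data
  ClientDataSet1.Open;
end;
```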
This provides an easy way to view the data and look at it in a read-only way. Note that I
specifically say “read-only”, since the TSqlServerMethod does not allow the
TDataSetProvider-TClientDataSet combination to send any updates from the DataSnap
client to the DataSnap server this way.
Also note that the TSqlServerMethod can only connect to a TSQLConnection component
(i.e. using DBX over TCP/IP, HTTP or HTTPS) and not using a TDSRestConnection
component using HTTP or HTTPS.
TDSProviderConnection
If we want to apply updates, then we need a TDSProviderConnection component which
gives us a reference to a DataSetProvider at the server side, so we can not only read the
data, but also send the updates.
Note that the TDSProviderConnection can only work with DBX servers, and not with REST.
So like the TSqlServerMethod component, TDSProviderConnection can only connect to a
TSQLConnection and not to a TDSRestConnection component.
First, we need to make a change to the server data module at the server side, since we
only added a function to return the TDataSet, but now we must add an actual
TDataSetProvider and make sure it’s exported from the DataSnap server to the DataSnap
clients. So, return to the ServerMethods unit and place a TDataSetProvider component
next to the TSQLDataSet, pointing the DataSet property of the TDataSetProvider to the
TSQLDataSet. We should also rename the TDataSetProvider component, making sure that
it has a meaningful name like dspEmployees.
Now, recompile the DataSnap server and deploy it again (don’t forget to stop the Windows
Service or to recycle the application pool on IIS), so we can modify the client in order to
make changes to the exposed data.
Optionally, we can add a TSQLMonitor component here at the server side, and try to trace
the messages that are being passed between the TSQLConnection component and the
DBMS itself.
procedure TMyServerMethods.SQLMonitor1Trace(Sender: TObject;
  TraceInfo: TDBXTraceInfo; var LogTrace: Boolean);
begin
  LogInfo('OnTrace: ' + TraceInfo.Message);
  LogTrace := False; // do not log further…
end;
The TSQLMonitor component works both at the DataSnap client side for a TSQLConnection
with a DataSnap driver, and at the DataSnap server side with an InterBase connection.
TDSProviderConnection Client
To change the DataSnap Client in order to make use of the exposed TDataSetProvider
component, we have to remove the TSqlServerMethod and TDataSetProvider components
from the client form, but we need to add a TDSProviderConnection component instead.
Point the SQLConnection property of the TDSProviderConnection to the TSQLConnection
component, connecting to the DataSnap Server. We also need to enter a value for the
ServerClassName property of the TDSProviderConnection. It’s a bit unfortunate that we
(still) do not get a drop-down list here. Right now, we must manually enter the name of
the TDSServerModule, which was TMyServerMethods in our case.
In the previous example, the TClientDataSet only connected its ProviderName to the local
DataSetProvider1. However, using the TDSProviderConnection component, we must first
assign the RemoteServer property of the TClientDataSet to the TDSProviderConnection
component, and then select a new value for the ProviderName property (which was still
assigned to the DataSetProvider1 value by the way, but should now point to dspEmployees
– the more descriptive name of the TDataSetProvider component which was exposed from
the DataSnap Server).
The drop-down combobox for the ProviderName property should now show the
dspEmployees option (the name of the TDataSetProvider component which is exported
from the ServerDataMod unit).
Now, we can set the Active property of the TClientDataSet component to True again in
order to view live data at design-time:
Setting the Active property of the TClientDataSet component will also set the Connected
property of the TSQLConnection component to True. Note that it’s not a good idea to leave
these two properties set to True in the client form at design-time. First of all, if you open
the DataSnap Client project in the Delphi IDE, it will attempt to make a connection to the
DataSnap Server, which may fail if the server is not up-and-running. Second, if you start
the application at run-time, and no connection is available, then the application will also
fail with an exception. This prevents you from using the application on local data,
rendering it useless if no connection is present.
A better way is to include some menu option or button to explicitly connect the
TSQLConnection component and/or activate the TClientDataSet. This is also a great
moment to include username/password information, which we’ll cover later. For now,
make sure the Active property of the TClientDataSet is set to False, as well as the
Connected property of the TSQLConnection component. Then, place a button on the client
form, and in the OnClick event handler open the TClientDataSet explicitly.
procedure TForm1.btnOpenDBClick(Sender: TObject);
begin
  ClientDataSet1.Open;
end;
Time to add some code to actually change the data (at the client side), and send the
changes back to the server.
Database Updates
There are actually two ways to send the changes we’ve made at the client back to the
server: automatic or manual. Both call the same method in the end, but the invocation is
either automatic or user-driven, both with advantages and disadvantages.
For the automatic approach, we can use the OnAfterPost and OnAfterDelete events of the TClientDataSet, since these fire right after changes have been made to the data. In the event handlers – which can share a single implementation – we can call the ApplyUpdates method of the TClientDataSet, sending the changes (also called the “Delta”) back to the server to be resolved in the database.
procedure TForm1.ClientDataSet1AfterPost(DataSet: TDataSet);
begin
  ClientDataSet1.ApplyUpdates(0);
end;
ApplyUpdates takes a single argument that controls whether we’re using a transaction to apply all updates or not. If we pass 0 or a positive value, then the updates are applied in the context of a transaction. When no errors are encountered, a COMMIT is done. However, if more than the passed number of errors is encountered (more than 0 in the example above), then a ROLLBACK will occur, meaning that no updates will have been applied. So passing 0 or higher means: all or nothing for the updates.
As an alternative, we can pass -1 to ApplyUpdates, which means: do not use a transaction,
and try to apply as many updates as possible, but return with the errors that were
encountered. This is helpful if you need to apply as many updates as possible, only
needing to worry about the few that failed (without having all updates rolled back).
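The difference between the two modes can be sketched as follows, assuming any individual errors are handled in the OnReconcileError event:

```delphi
// All-or-nothing: a transaction is used; a single error (more than
// the 0 we allow) causes a ROLLBACK, and no updates are applied
if ClientDataSet1.ApplyUpdates(0) > 0 then
  ShowMessage('No updates were applied');

// Best-effort: no transaction; as many updates as possible are
// applied, and the errors that occurred are reported
if ClientDataSet1.ApplyUpdates(-1) > 0 then
  ShowMessage('Some updates could not be applied');
```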
The call to ApplyUpdates(0) is best used in cases where you need to apply all or nothing,
like a family trip by airplane, where you do not want little Kevin to stay behind. On the
other hand, ApplyUpdates(-1) is best in situations where the individual updates are
unrelated, like sending out e-mail messages or updates that need to be processed as soon
as possible without having to wait for the rest.
There is no good or bad: the best technique depends on the situation at hand. Just make
sure you know what it is that you need.
If something bad has happened (like a record no longer found) during the update, we can
get feedback in the OnReconcileError event of the TClientDataSet. For now, let’s just show
the error to the user. We’ll get back to handling these kinds of errors in more detail
shortly.
procedure TForm1.ClientDataSet1ReconcileError(DataSet: TCustomClientDataSet;
  E: EReconcileError; UpdateKind: TUpdateKind; var Action: TReconcileAction);
begin
  ShowMessage(E.Message)
end;
The advantage of this approach is that the user will never be able to “forget” to send an update to the database. This is important, because DataSnap is a multi-tier architecture, which means that by definition the user looks at and works with data that might be outdated (even if only by a few seconds) and hence deleted or modified by another user.
One of the disadvantages is that all inserts, updates and deletes are immediately posted to
the server, and there is no chance to perform an UNDO (especially since the updates
themselves can be updated again by other users). This is the main reason why some users
actually prefer a more manual approach that gives more control over updates and undo.
When the manual approach is used, then all changes are kept at the client side – inside
the memory of the TClientDataSet component. This allows the user to undo certain parts
of the changes: either the last change, a specific record, or all pending updates.
The user then clicks on the “Apply Updates” button to explicitly call the ApplyUpdates method when ready. The possible danger is that the user could “forget” to click on the update button, so we should add a check to the form or application to prevent it from closing when there are still changes left in the TClientDataSet. The latter can be checked by looking at the ChangeCount property of the TClientDataSet.
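Such a check could be implemented in the form’s OnCloseQuery event handler, for example (a sketch, assuming a simple MessageDlg confirmation):

```delphi
procedure TForm1.FormCloseQuery(Sender: TObject; var CanClose: Boolean);
begin
  // ChangeCount > 0 means there are still unapplied changes
  if ClientDataSet1.ChangeCount > 0 then
    CanClose := MessageDlg('There are unapplied changes. Close anyway?',
      mtConfirmation, [mbYes, mbNo], 0) = mrYes;
end;
```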
In order to demonstrate all these undo features, we need three more buttons. One with
caption “Undo Last”, one with “Undo Current” and a final one with caption “Undo All” to
control the way we want to undo our not-yet-applied changes and updates.
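The three OnClick handlers can then use the corresponding TClientDataSet methods. A sketch, assuming the buttons are called btnUndoLast, btnUndoCurrent and btnUndoAll (“Undo Current” maps to RevertRecord, and “Undo All” to CancelUpdates):

```delphi
procedure TForm1.btnUndoLastClick(Sender: TObject);
begin
  // True for FollowChange: position on the record that was just undone
  ClientDataSet1.UndoLastChange(True);
end;

procedure TForm1.btnUndoCurrentClick(Sender: TObject);
begin
  ClientDataSet1.RevertRecord; // undo pending changes to the current record
end;

procedure TForm1.btnUndoAllClick(Sender: TObject);
begin
  ClientDataSet1.CancelUpdates; // discard the entire change log
end;
```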
Note that the UndoLastChange method has an argument named “FollowChange” which is
set to True. This means that if you undo the last change while currently positioned in a
DBGrid at the last record (and let’s assume the last change was in the first record), then
your current record in the DBGrid will be set to the first record – the one that was just
“undone”. Without passing True for FollowChange, there is a good chance that the user will
click on the “Undo Last” button, but doesn’t see anything being changed on screen. It’s
very tempting in that situation to click on the button again, thereby undoing another
change. And again, all with potentially work-destroying effect, since the user may not see
what he’s (un)doing. With the FollowChange parameter at least the user will end up seeing
the record that was reverted back to the previous state.
In practice, however, users prefer the Undo Current way of undoing changes, which means that if they’ve made several changes in a DBGrid, they can just select the record they want to undo, and do not have to worry whether this was the last change they made or not.
Finally, the option to cancel all updates can be useful if you just want to start all over
again (without having to restart the client itself).
All undo options are only available while the updates are still at the client side, i.e. as long
as the user didn’t explicitly click on the Apply Updates button (which will send the changes
to the server to resolve them in the real DBMS).
In order to demonstrate the briefcase model, we need two more buttons: one with caption “Save”, and one with caption “Load” to handle the saving and loading of the state of the TClientDataSet.
For the Load and Save buttons, we should first define a common constant to use as
filename. This constant can be placed inside the TForm, in the private section:
private
  const
    BriefcaseFile = 'Briefcase.xml';
With that BriefcaseFile constant in place, we can implement the Load and Save as follows:
procedure TForm1.btnLoadClick(Sender: TObject);
begin
  ClientDataSet1.LoadFromFile(BriefcaseFile);
end;
Note that the SaveToFile method has an optional second argument to specify the format of the data packet file: either binary (the default), XML or XML UTF-8. You do not have to make a choice if the method can determine the format based on the filename: by default the format is binary, unless the file has an .xml extension, in which case the content is saved as XML (and not XML UTF-8 by the way, so if you want to use special Unicode characters in your briefcase file, then you should explicitly save the file as dfXMLUTF8).
procedure TForm1.btnSaveClick(Sender: TObject);
begin
  ClientDataSet1.SaveToFile(BriefcaseFile, dfXMLUTF8);
end;
Working offline can be useful when you do not have a connection to the DataSnap server
for whatever reason. You can still make changes to the TClientDataSet, save the changes
to a local briefcase file, and in a later session load the local briefcase file and apply the
changes back to the DataSnap Server (once available again).
Note that the BriefcaseFile should be removed after the updates have been applied
successfully, to avoid the user loading it again (with the changes that have already been
applied), and trying to call ApplyUpdates again which will then probably fail (because the
updates have already been applied). The ApplyUpdates method is in fact a function,
returning the number of errors that it encountered. So, if there were no errors, we should
see if the BriefcaseFile exists and delete it from the local disk, as follows:
procedure TForm1.btnApplyUpdatesClick(Sender: TObject);
begin
  if ClientDataSet1.ApplyUpdates(0) = 0 then
    if FileExists(BriefcaseFile) then
      DeleteFile(BriefcaseFile)
end;
Note that any update errors will be reported in the OnReconcileError event handler of the
TClientDataSet, where we can also try to handle the specific issues.
The return value of the call to ApplyUpdates is the number of errors. When passing 0, we’re only interested in whether this value is bigger than 0. In that case, no updates
have been applied at all. However, if we passed -1, then the number of updates still
pending is returned by the call to ApplyUpdates.
So what if two clients connect to the DataSnap Server, obtain the Employees data, and both make some changes to the first record? According to what you've built so far, both
clients could then send the updated record back to the DataSnap Server using the
ApplyUpdates method of their TClientDataSet component. If both pass zero as value for
the "MaxErrors" argument of ApplyUpdates, then the second one to attempt the update
will be stopped. The second client could pass a numerical value bigger than zero to
indicate a fixed number of errors/conflicts that are allowed to occur before the update is
stopped.
However, even if the second client passed -1 as argument (to indicate that it should
continue updating no matter how many errors occur), it will never update the records that
have been changed by the previous client. In other words: you need to perform some
reconcile actions to handle updates on already-updated records and fields.
Fortunately, Delphi contains a very useful dialog especially written for this purpose. And
whenever you need to do some error reconciliation, you can consider adding this dialog to
your DataSnap Client application (or write one yourself).
To use the one available in Delphi, just do File | New - Other, go to the Delphi Files
subcategory of the Delphi Projects in the Object Repository and select the Reconcile Error
Dialog icon.
Once you select this icon and click on OK, a new unit Vcl.RecError.pas is added to your
DataSnapClient project. This unit contains the definition and implementation of the Update
Error dialog that can be used to resolve database update errors.
This is an event handler with four arguments: first of all the TClientDataSet component
that raised the error, second a specific ReconcileError that contains a message about the
cause of the error condition, third the UpdateKind (insert, delete or modify) that generated
the error and finally as fourth argument the Action that you feel should be taken.
As Action, you can return the following possible enum values (the order is based upon
their actual enum values):
- raSkip - do not update this record, but leave the unapplied changes in the change log.
Ready to try again next time.
- raAbort - abort the entire reconcile handling; no more records will be passed to the
OnReconcileError event handler.
- raMerge - merge the updated record with the current record in the (remote) database,
only changing (remote) field values if they changed on your side.
- raCorrect - replace the updated record with a corrected value of the record that you
made in the event handler (or inside the ReconcileErrorDialog). This is the option in
which user intervention (i.e. typing) is required.
- raCancel - undo all changes inside this record, turning it back into the original (local)
record you had.
- raRefresh - undo all changes inside this record, but reloading the record values from
the current (remote) database (and not from the original local record you had).
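As an illustration of these enum values, a hand-written OnReconcileError handler (a sketch, without the Reconcile Error dialog; the chosen actions are only an example) could pick an action based on the kind of update:

```pascal
procedure TFrmClient.ClientDataSet1ReconcileError(DataSet: TClientDataSet;
  E: EReconcileError; UpdateKind: TUpdateKind; var Action: TReconcileAction);
begin
  case UpdateKind of
    ukDelete: Action := raCancel; // undo the local delete
    ukInsert: Action := raSkip    // keep the insert in the change log for later
  else // ukModify
    Action := raMerge             // try to merge non-conflicting field changes
  end
end;
```

Which action is appropriate for each update kind depends entirely on the application.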
The good thing about the ReconcileErrorForm is that you don't really need to concern
yourself with all this. You only need to do two things.
First, you need to include the ErrorDialog unit inside the DataSnap Client main form
definition. Click on the DataSnap Client Form and do File | Use Unit to get the Use Unit
dialog. With the Client Form as your current unit, the Use Unit dialog will list the only other
available unit, which is the Vcl.RecError unit.
The second thing you need to do is to write one line of code in the OnReconcileError event
handler in order to call the HandleReconcileError function from the ErrorDialog unit (that
you just added to your ClientMainForm import list). The HandleReconcileError function has
the same four arguments as the OnReconcileError event handler (not a real coincidence, of
course), so it's a matter of passing arguments from one to another, nothing more and
nothing less. So, the OnReconcileError event handler of the TClientDataSet component can
be coded as follows:
procedure TFrmClient.ClientDataSet1ReconcileError(DataSet: TClientDataSet;
E: EReconcileError; UpdateKind: TUpdateKind;
var Action: TReconcileAction);
begin
//ShowMessage(E.Message)
Action := HandleReconcileError(DataSet, UpdateKind, E)
end;
Of course, you can also build your own handling, where you just display the exception
message in E.Message and assign raRefresh to the value of Action, so all changes are
undone and the current values from the database are loaded into the TClientDataSet
instead.
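Such a minimal hand-written handler could look as follows (a sketch):

```pascal
procedure TFrmClient.ClientDataSet1ReconcileError(DataSet: TClientDataSet;
  E: EReconcileError; UpdateKind: TUpdateKind; var Action: TReconcileAction);
begin
  ShowMessage(E.Message); // tell the user what went wrong
  Action := raRefresh     // drop the local changes, reload current server values
end;
```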
Skip moves on to the next record, skipping the requested update (for the time being). The
unapplied change will remain in the change log. Cancel also skips the requested update,
but it cancels all further updates (in the same update packet). The current update request
is skipped in both cases, but Skip continues with other update requests, and Cancel
cancels the entire ApplyUpdates request.
Refresh just forgets all updates you made to the record and refreshes the record with the
current value from the server database. Merge tries to merge the update record with the
record on the server, placing your changes inside the server record. Refresh and Merge
will not process the change request any further, so the records are synchronized after
Refresh and Merge (while the change request can still be redone after a Skip or Cancel).
Correct, the most powerful option, actually gives you the option of customizing the update
record inside the event handler. For this you need to write some code or enter the values
in the dialog yourself.
For Merge to work well, the TClientDataSet must be allowed to change the SQL for the
update and delete commands. By default, the WHERE clause of the SQL UPDATE and
DELETE commands contains all fields, except for blob fields. However, it's seldom needed
to find a record by all fields, and it will lead to reconcile errors if, for example, one user
wants to change the FirstName while another user has changed the LastName. Since the
WHERE clause of the UPDATE command will contain the old value of the LastName (which
has already been changed to some other value in the database), the update will fail
because the record cannot be found with the original LastName value.
It may be clear that Merge is in fact very powerful, since it allows two people to merge
changes to the same record, as long as they do not modify the same fields (or the primary
key of course).
Instead of waiting for the Reconcile Error dialog to show itself, we can decide to use the
"Merge" capabilities right from the start. For this, we must open up the DataSnap Server
project again, and move to the ServerMethods unit. The TDataSetProvider dspEmployees
that we expose here has a very powerful property called UpdateMode. By default,
UpdateMode is set to upWhereAll, which defines the situation where all fields (except for
memo fields) are passed in the WHERE clause of UPDATE and DELETE commands.
Alternatives to upWhereAll are upWhereChanged and upWhereKeyOnly.
The latter is dangerous, since it will only look at the primary key and will allow anyone to
update a record and overwriting previous values (since no check is made to see if the
record has just been changed by another user).
The upWhereChanged option for UpdateMode is in my view the most powerful choice. It
ensures that the WHERE clause of the UPDATE command only contains the primary key as
well as all original values from fields that have changed (and are part of the update itself).
Fields that were not changed are not mentioned in the update part or in the WHERE
clause, which results in the ability to merge changes from different users.
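In the Server Methods unit, this boils down to a single property value on the provider. A sketch of the relevant .dfm lines (assuming the TSQLDataSet is called sqlEmployees):

```pascal
object dspEmployees: TDataSetProvider
  DataSet = sqlEmployees
  UpdateMode = upWhereChanged
end
```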
For real-world projects, we often use the upWhereChanged value for the UpdateMode
property of the TDataSetProviders exposed from the Server Modules. We also often pass
-1 to the call to ApplyUpdates, resolving as many updates as possible, only leaving the
user with the updates to records that were not found or already changed by another user.
These situations usually need to be handled by the user refreshing the data, and
re-applying the updates after making the changes again (but only for the records that
failed to be updated the first time).
Permission to Update
Although it’s nice that the DataSnap Client offers the ability to automatically or manually
send updates to the DataSnap Server, this is not something that may be allowed for every
user. The Security section of this courseware manual already covered how to handle this,
so let’s add the specific details to the current example application.
For this, we need to modify the ServerContainer unit, or – for the ISAPI DLL – the web
module. Since the ISAPI DLL is the one being deployed, I’ll show the web module here to
complete the example (but feel free to make similar changes to your ServerContainer).
The component we need to use for this is the TDSAuthenticationManager, and we should
assign the role “admin” to at least one user, like “Bob”, in the OnUserAuthenticate event
handler implementation in the Server Container unit:
if Assigned(UserRoles) then // only if assigned...
if LowerCase(User) = 'bob' then
UserRoles.Add('admin')
else UserRoles.Add('guest');
This will ensure that user Bob gets the role “admin”, but all other users explicitly get the
role “guest”.
The next step involves adding and defining a new item for the Roles collection of the
TDSAuthenticationManager component, where we specify that for the operation / action
TmyServerMethods.AS_ApplyUpdates the role "admin" is authorized, but the role "guest" is
explicitly denied.
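In the .dfm, such a Roles item could look roughly as follows (a sketch; the exact collection item properties may differ slightly between DataSnap versions):

```pascal
object DSAuthenticationManager1: TDSAuthenticationManager
  OnUserAuthenticate = DSAuthenticationManager1UserAuthenticate
  Roles = <
    item
      AuthorizedRoles.Strings = ('admin')
      DeniedRoles.Strings = ('guest')
      ApplyTo.Strings = ('TmyServerMethods.AS_ApplyUpdates')
    end>
end
```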
You will need to recompile and redeploy the DataSnap Server before these settings have
effect, of course.
The defined TDSAuthenticationManager Roles will ensure that when we login as “Bob”, we
can call ApplyUpdates, but when we login using another name, we get an error message
when trying to apply the updates:
Where the name of the user is given as well (but not the set of roles or the name of the
operation / action).
Of course, it would be a good idea to make the TClientDataSet readonly for users with the
role “guest”, and to hide the button “Apply Updates” to make sure the user is not confused
into thinking that editing is allowed, only to be denied when trying to apply the updates.
Note that a different error message is given if we use the TDSRestConnection component,
in which case we'll get an HTTP/1.1 500 Internal Server Error in case of a failed
authorization call.
Stateless Clients
By default, the TDSServerClass component has its LifeCycle property set to Session, which
gives each client connection its own session – allowing the server to maintain the current
record and state for the client. Alternatives for Session are Server (one instance for all
client connections) and Invocation – an instance for each incoming client request, during
the lifetime of the request.
The Invocation setting means that the DataSnap Server can handle more clients, since
each request is handled by an instance which is then freed, allowing other clients to
connect.
So, in order to become stateless, and more scalable, we need to set the LifeCycle to
Invocation (where by default it’s set to Session). We need to make this change in the
LifeCycle property of the TDSServerClass component in the Server Container unit (for the
stand-alone and DataSnap Service applications). The required property values should be
as follows:
object DSServerClass1: TDSServerClass
OnGetClass = DSServerClass1GetClass
Server = DSServer1
LifeCycle = 'Invocation'
end
The TDSServerClass component can be found on the Server Container Unit for the
DataSnap REST server application.
After having changed the LifeCycle from Session to Invocation, we can recompile and
redeploy the DataSnap application. Initially, we do not need to change the client. The
client can still connect to the server, collect the data from the server (the result from the
TSQLDataSet’s query), perform Load and Save on the local TClientDataSet, perform some
undo operations, and finally call the ApplyUpdates (after having edited existing records)
without problems.
However, while this is “stateless”, it’s not using the full stateless capabilities of DataSnap.
We still retrieve all records from the dspEmployees TDataSetProvider, which may take
some time if the SQL query in the TSQLDataSet produces a large number of records in the
resultset. It might be a better idea to request only a limited set of records initially, and tell
the server to send only a fixed number of records.
Instead, we can change the value of PacketRecords from -1 to 5, for example, telling the
server to send data in packets of 5 records. If we show the data inside a DBGrid, and the
grid shows more than 5 records, then more packets will be requested until at least the first
page of the DBGrid is full. However, any further packets will not be sent yet, and need to
be retrieved on demand (by the client).
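Setting this up in code instead of the Object Inspector is a matter of two assignments before opening the dataset (a sketch):

```pascal
ClientDataSet1.PacketRecords := 5;    // fetch 5 records per data packet
ClientDataSet1.FetchOnDemand := True; // let the grid trigger further packets
ClientDataSet1.Open;
```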
In order to experiment with this, open the ClientForm again, select the TClientDataSet
component and set PacketRecords to 5. In my example, this ensures that we initially get
only 10 records inside the grid (the grid is high enough to show 6 records, so we need two
packets of 5 records each).
Unfortunately, if you compile and run the DataSnap client and activate the
TClientDataSet, you get the first packet of 5 records, followed by a Key violation error
when the grid attempts to get the next 5 records. This error may even show up
automatically, if the application opens the TClientDataSet on startup. In that case, you
may first see a fatal error about the application:
The option "Check online for a solution and close the program", will cause a delay that you
do not want (the application DataSnapClient.exe can hardly be expected to be known by
Microsoft's database of "known issues"). And the option to "Debug the program" will start
a just-in-time debugger, which is usually Visual Studio (if you have that installed, for
example as part of Prism), which is also not a good idea.
If you select the "Close the program" option, it will continue, and show you the actual
error: Key violation:
The initial error message is a good reminder that we should never keep the TClientDataSet
(and/or TSQLConnection) active at design-time, nor open them in the FormCreate.
Instead, it is better to use an explicit action from the user to open the data.
In this case, we can just place a TButton on the client form, and in the OnClick event open
the TClientDataSet (removing the ClientDataSet1.Open call from the FormCreate event
handler instead).
Now, if we run the application and click on the Open Database button, we'll get a similar
Key Violation error, but this time with more background information:
The key violation problem is caused by the stateless nature of the DataSnap ISAPI server,
in combination with your use of the PacketRecords property of the ClientDataSet
component.
When the ClientDataSet requests the next packet of records, the DataSnap server should
somehow know that the client has already received the first packet of records. However,
since the DataSnap server is now stateless, when the client performs a request for another
data packet, the server dutifully sends a data packet containing the first 5 records again.
If no primary key is defined, you will just get a duplicate data packet at the client side
(meaning you see two occurrences of each record), and if a key is defined then you will
actually get a key violation (since a record with that key already exists at the client side).
Since the DataSnap Server cannot maintain the state information for the client, it's up to
the client itself to maintain its own state (that shouldn't be a problem), and send its own
state to the DataSnap Server at the time it requests the next packet of records. This is
done in the OnBeforeGetRecords event handlers of both the ClientDataSet (at the client
side - to send the state information) and the DataSetProvider (at the server side - to
retrieve the state information).
OnBeforeGetRecords
To get the next data packet with X records from the DataSetProvider component of a
DataSnap Server, the ClientDataSet component can use the OwnerData parameter of the
OnBeforeGetRecords event handler to specify a state information, like the key value of the
last record currently retrieved at the client side (or the number of records already
retrieved). Assuming we use the EMPLOYEE table from the InterBase Employee database,
which has an EMP_NO key field, we can put the value of the EMP_NO field in the OwnerData
parameter as follows (note that I'm using a bookmark to make sure the current cursor
position at the client side isn't lost when we move to the last record in the ClientDataSet):
procedure TClientForm.ClientDataSet1BeforeGetRecords(Sender: TObject;
var OwnerData: OleVariant);
var
CurRec: TBookmark;
OldFetchOnDemand: Boolean;
begin
with Sender as TClientDataSet do if Active then // Client already active?
begin
OldFetchOnDemand := FetchOnDemand; // save
CurRec := GetBookmark;
try
FetchOnDemand := False;
Last;
OwnerData := FieldByName('EMP_NO').AsString;
CodeSite.Send('TClientDataSet EMP_NO = ' + OwnerData);
GotoBookmark(CurRec)
finally
FetchOnDemand := OldFetchOnDemand; // restore
FreeBookmark(CurRec)
end
end
end;
The call to the Last method will generate a stack overflow if the FetchOnDemand property
of the TClientDataSet component is set to True (in that case, moving to the last record will
actually trigger the TClientDataSet component to fetch (on demand) all records, which will
fire this OnBeforeGetRecords event handler again, etc. until you finally get a stack error).
That's why we have to set FetchOnDemand to False just before moving to the Last record
and we should restore the original value of FetchOnDemand - which by default is set to
True.
At the DataSnap Server side, just before the DataSetProvider component sends the
requested data packet with X records, the BeforeGetRecords event handler is called,
including the OwnerData parameter that we assigned on the ClientDataSet side.
So, if a value is passed, the DataSetProvider can use that value to look for the record with
that key value, and move to the next one (so the dataset is positioned at the correct "next
record to retrieve"). This can be implemented at the TDataSetProvider on the Server
Methods unit (add the System.Variants unit to the uses clause for the VarIsNull and
VarIsEmpty functions):
procedure TRemoteDataModule.dspEmployeesBeforeGetRecords(Sender: TObject;
var OwnerData: OleVariant);
var
EMP_NO: Variant;
begin
EMP_NO := OwnerData;
if not VarIsNull(EMP_NO) and not VarIsEmpty(EMP_NO) then
with Sender as TDataSetProvider do
begin
DataSet.Active := True; // just in case
CodeSite.Send('TDataSetProvider EMP_NO = ' + EMP_NO);
DataSet.Locate('EMP_NO', EMP_NO, []);
DataSet.Next // start at next record
end
end;
Note that it's really important that both the ClientDataSet and the DataSetProvider know
the semantics of the OwnerData parameter in the OnBeforeGetRecords event handler. Also
note that this parameter can get a different meaning for each ClientDataSet-
DataSetProvider pair, as long as the meaning matches (you don't want to pass the number
of client-side records only to try to interpret this value as a key value).
Finally, note that it's required to assign the value in the OwnerData OleVariant to a normal
Variant first, before trying to get the string or integer data out of it; otherwise you'll get
an empty string (or the value 0) as result.
The result - after recompiling and redeploying the DataSnap Server - is a DBGrid filled with
10 records, the combined set of two data packets, this time with unique keys (or at least,
this time it should be with unique keys):
If you look closely at the screenshot on the previous page, however, you'll notice a
strange phenomenon: some keys still seem to appear more than once. If you scroll
through the grid, you'll see that they disappear as soon as you go down and scroll up
again. It's most likely a bug in the buffering of the DBGrid, although calling Refresh (or
Repaint) doesn't seem to solve it.
Fortunately, there’s an easy workaround we can use: after we retrieve the data of the first
two packets (of 5 records each) by opening the TClientDataSet, we should explicitly
navigate to the first record, as follows:
procedure TForm1.btnOpenClick(Sender: TObject);
begin
ClientDataSet1.Open;
ClientDataSet1.First;
end;
This will ensure that we really see only the unique 10 records in the DBGrid.
There's one more issue to address here: with the FetchOnDemand property set to True,
the ClientDataSet automatically fetches (on demand) the next data packet from the
server when needed. If someone sets FetchOnDemand to False at all times, the next data
packet must be requested explicitly with the GetNextPacket method. In that case the
TDBGrid would only show the first five records, since the next 5 will not be fetched on
demand (even if the grid needs them); we would have to explicitly call the GetNextPacket
method to get the next set of records to appear.
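A hypothetical btnNextPacket button could then request the next packet explicitly (a sketch):

```pascal
procedure TForm1.btnNextPacketClick(Sender: TObject);
begin
  // GetNextPacket returns the number of records actually fetched
  if ClientDataSet1.GetNextPacket = 0 then
    ShowMessage('No more records')
end;
```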
Finally, note that reading the RecordCount property of the TClientDataSet will return the
number of records at the server, and not at the client.
Master-Detail
So far we’ve only looked at the EMPLOYEE table of the example database. But it’s time to
combine this table with another one, building master-detail relations. I’d like to build the
relation between EMPLOYEE and DEPARTMENT, using the DEPT_NO field.
There are three ways to build a master-detail relation: we can define a server-side JOIN,
we can connect two datasets at the server (generating nested datasets), or we can export
both tables and make the connection at the client side. Each of these solutions has its
advantages and disadvantages.
The result is a dataset with 18 fields and 42 records (all records from the original
EMPLOYEE table, with additional DEPARTMENT fields added to them).
To demonstrate this, return to the DataSnap Server application, open the ServerMethods
unit, and add a new TSQLDataSet component. Connect it to the SQLConnection
component, and add the SQL from one of the SELECT queries above to its CommandText
property. Don’t forget to toggle the Active property to True (to verify that the SQL is
correct) and back to False again. Don’t forget to ensure that the TSQLConnection
component has its Connected property set to False when it’s time to deploy the server.
Then, add a TDataSetProvider component, connect its DataSet property to the SQLDataSet
and give it a useful name, like dspEmployeeDepartment
The advantage of this approach is that the JOIN is really easy to construct, the DBMS does
all the work, and the DataSetProvider only has to export the result.
The downside is that the resulting dataset no longer comes from one table, so it’s hard to
do updates in this JOIN. In fact, we can generally only update one of the tables that are
used for a JOIN, and we have to use the OnGetTableName event of the TDataSetProvider
to return the name of the table to send the updates to. Assuming we only want to update
the fields from the EMPLOYEE part of the JOIN, we should set TableName to EMPLOYEE, as
follows:
procedure TMyServerMethods.dspEmployeeDepartmentGetTableName(Sender: TObject;
DataSet: TDataSet; var TableName: WideString);
begin
TableName := 'EMPLOYEE';
end;
That will ensure that the SQL UPDATE statement generated by the TDataSetProvider will
use EMPLOYEE as table name (and of course we should ensure that only fields that belong
to this table are updated, possibly enforced by configuring the other fields to be
read-only). Of course, you could decide to update only the DEPARTMENT fields in a similar
way, but the TDataSetProvider can only generate a single UPDATE command, for one table.
Another disadvantage is that all records are merged, and it’s not easy to quickly see all
EMPLOYEES that belong to a single DEPARTMENT (unless you sort the records by
DEPT_NO, but even then you’d have to look and count for yourself).
To demonstrate this effect, just recompile and redeploy the DataSnap Server with the
dspEmployeeDepartment TDataSetProvider, and switch back to the DataSnap Client
project in order to view the resulting data.
In order to view the data from the new TDataSetProvider, all we need to do is change the
ProviderName property of the TClientDataSet component on the client form. Right now, it
should be assigned to dspEmployees, but if we open up the drop-down listbox for the
ProviderName property, we can also select dspEmployeeDepartment now. Make sure
PacketRecords is set to -1 to retrieve all records (even when in stateless mode).
If you now take a look at the data at design-time, it’s not easy to find all employees of
department number 600 for example (even when sorting by DEPT_NO):
On the Server Methods unit from the DataSnap Server, we already have two TSQLDataSet
components, one with the SQL property set to
SELECT EMP_NO, FIRST_NAME, LAST_NAME,
HIRE_DATE, JOB_COUNTRY FROM EMPLOYEE
Both SQL commands need a little changing in order to produce server-side nested
datasets.
First of all, we need a common field to connect the two SQL result sets, and this field is the
DEPT_NO field. As a consequence, we need to adjust the first SELECT command to ensure
that the DEPT_NO field is included:
SELECT EMP_NO, FIRST_NAME, LAST_NAME, DEPT_NO,
HIRE_DATE, JOB_COUNTRY FROM EMPLOYEE
Second, we need to change the JOIN in the second SQL command to just a SELECT fields
FROM the DEPARTMENT table, including at least the DEPT_NO field. We can do this with a
simple
SELECT * FROM DEPARTMENT
Of course you can remove the * and list only the fields you really need. In fact, for real-
world projects, I would recommend explicitly listing the field names that you want to use.
The advantage of listing the field names explicitly is that you do not get extra data when
the table grows with extra fields (unless you need these fields, but then you would
probably also need to change the GUI to display them). The disadvantage is that it may be
a bit harder to explicitly list all fields, and you may get an error if the database is
restructured and fields are renamed (but in that case it doesn’t hurt to check all your SQL
commands anyway). For the demos in this courseware manual, I leave it as an exercise for
the reader to explicitly list the required fields in the SELECT commands, and will generally
use the shorter SELECT * FROM to illustrate the points.
Then, we need a TDataSource component connected to the master table, which in this
case is the TSQLDataSet that returns the DEPARTMENT records. Give it the name
dsDepartments, and connect its DataSet property to the TSQLDataSet that returns the
DEPARTMENT records.
To make things a bit more maintainable on the Server Methods unit, make sure to assign
sensible names to the TSQLDataSet components, like sqlEmployees and sqlDepartments.
Up until this point, it was enough to just assign a useful name to the TDataSetProviders,
since these were the ones exported from the DataSnap server to clients, but now we must
actually hook up TSQLDataSet components, so it’s a good idea to give them a useful name
as well.
The next step involves pointing the DataSource property of the sqlEmployees component
to the dsDepartments TDataSource. So we’re telling the sqlEmployees to get the value of
the :DEPT_NO parameter from (the current record in) the sqlDepartments dataset.
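Put together, the server-side wiring can be sketched in code as follows (normally you would set these properties in the Object Inspector; the exact field list matches the queries shown above):

```pascal
// master: all departments
sqlDepartments.CommandText := 'SELECT * FROM DEPARTMENT';
dsDepartments.DataSet := sqlDepartments;

// detail: the :DEPT_NO parameter is filled from the current master record
sqlEmployees.CommandText :=
  'SELECT EMP_NO, FIRST_NAME, LAST_NAME, DEPT_NO, ' +
  'HIRE_DATE, JOB_COUNTRY FROM EMPLOYEE WHERE DEPT_NO = :DEPT_NO';
sqlEmployees.DataSource := dsDepartments;

// the provider exports the master, with the detail as nested dataset
dspEmployeeDepartment.DataSet := sqlDepartments;
```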
Since the dspEmployeeDepartment TDataSetProvider is still pointing its DataSet property
to the master sqlDepartments TSQLDataSet, this means the TDataSetProvider now
exposes not only the master TSQLDataSet, but also the detail records from the
sqlEmployees TSQLDataSet as nested dataset. The employee records will be placed inside
a datasetfield, as we’ll see shortly.
The benefit of this nested-dataset approach is that this query passes a 2-level deep
dataset from the DataSnap Server to the client, where we can update both the server and
the client records. And because the result is nested, we have no problem identifying the
number of employees that belong to a certain department.
Recompile and redeploy the DataSnap Server again. At the DataSnap client side, the
TClientDataSet was already connecting to the dspEmployeeDepartment, showing 42 joined
records in the old situation. In the new situation, we see far less records:
We can right-click on the TClientDataSet component and select the Fields Editor. In the
fields editor, we can right-click again and select “Add All Fields” to show all available fields.
The exported TDataSetProvider dspEmployeeDepartment exposes seven master fields
(DEPT_NO, DEPARTMENT, HEAD_DEPT, MNGR_NO, BUDGET, LOCATION, and PHONE_NO
from the DEPARTMENT table), plus a sqlEmployees dataset field that holds the detail
EMPLOYEE records for this particular department.
We can use a second TClientDataSet, with the DataSetField property connected to the
sqlEmployees persistent field, in order to get our hands on the detail records. This also
means that we can use two different TDataSource and TDBGrid components to show the
contents of the master and the associated detail records.
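In code, the client-side wiring for the nested detail records can be sketched as follows (assuming ClientDataSet2 and DataSource2 for the detail side):

```pascal
// ClientDataSet1 is connected to dspEmployeeDepartment (master + nested detail)
ClientDataSet2.DataSetField :=
  ClientDataSet1.FieldByName('sqlEmployees') as TDataSetField;
DataSource2.DataSet := ClientDataSet2; // the second grid shows the detail records
```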
Note that any changes in the detail TClientDataSet (for the Employee records) cannot be
posted to the server using ApplyUpdates on that detail TClientDataSet, but have to be
posted using the master TClientDataSet instead. This may lead to somewhat more
complex code, although on the other hand it also simplifies things a bit, since you only
need to call ApplyUpdates on the master TClientDataSet - the one connected to the
TDataSetProvider on the server side.
We can still have Undo operations on the detail dataset, but the Load, Save and
ApplyUpdates can only operate on the master. So we can Save and Load the entire nested
dataset with the list of Departments and connected Employees. And we can send all
updates from the Departments and connected Employees, calling ApplyUpdates on the
master TClientDataSet.
To complete the example, we can add three new buttons to implement the Undo
operations on the detail TClientDataSet.
procedure TForm1.btnUndoLast2Click(Sender: TObject);
begin
ClientDataSet2.UndoLastChange(True)
end;
However, most customers will not want to use different buttons to act on different grids.
Instead, when they click on the Undo Current button, they expect the current record of the
last active grid to be undone. I leave it as an exercise for the reader to solve this puzzle.
A downside of the nested dataset approach is that it takes a little bit longer to construct
and configure the three components (two TSQLDataSets and one TDataSource) compared
to the single TSQLDataSet we needed for the SQL JOIN. But other than that, and
especially if you want to be able to update data, the nested dataset works much better.
Client-Side
A third way to define the master-detail relation is completely at the client side. For this,
we can re-use the dspEmployees that we already had on the Server Methods unit of the
DataSnap Server project - modifying it to return all records from the EMPLOYEE table. This
means that we need to remove the parameter from the SQL, and also clear the
TDataSource, since we now need to export all records from the Employees table.
Then, add one more TSQLDataSet and a TDataSetProvider component. Since we already
have a TSQLDataSet called sqlDepartments, let’s call the new one just sqlDepartment
(non-plural). The TDataSetProvider can be called dspDepartments.
At the client side, we then point the MasterSource property of cdsEmployees to a
TDataSource connected to the cdsDepartment dataset, after which we can use the
MasterFields property to connect the two TClientDataSets using the DEPT_NO fields.
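In code, the client-side master-detail link can be sketched as follows (assuming a dsDepartment TDataSource next to the two TClientDataSets):

```pascal
dsDepartment.DataSet := cdsDepartment;     // TDataSource on the master
cdsEmployees.MasterSource := dsDepartment; // the detail follows the master
cdsEmployees.MasterFields := 'DEPT_NO';    // link on the DEPT_NO field
cdsEmployees.IndexFieldNames := 'DEPT_NO'; // local index used by the link
```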
The advantage is that we only have to export two separate SQLDataSet components from
the server, without difficult JOIN or master-detail at the server side. Both datasets are
received at the client side, and can be updated individually.
The downside is that we have to define the master-detail relationship at the client side.
This doesn’t take a lot of work, but it requires that the full detail dataset is fetched before
the master-detail relationship can be made (even if you only want to see one master
record, since you need to examine all records from the Employees table to determine
which ones are detail records and which ones are not).
Another potential disadvantage is that you can no longer save the complete contents of
the master-detail relationship in one briefcase file using the SaveToFile method of the
TClientDataSet. With the nested dataset example, you only have to call SaveToFile for the
master TClientDataSet, but with the client-side master-detail relationship both the master
and detail TClientDataSet must be saved if you want to use the briefcase model (covered
later).
The main difference between this approach and the nested dataset approach is that we now
can (and need to) call ApplyUpdates on both TClientDataSets, and not just the master; we
also need to explicitly open both TClientDataSets, and not just the master one; and finally
we need to save both TClientDataSets to a briefcase instead of only the master one.
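Wired up in code, the client-side approach looks roughly as follows. This is a sketch: the component names cdsDepartments, cdsEmployees and dsDepartments are assumptions based on the naming conventions used earlier in this chapter.

```delphi
// link the two TClientDataSets at the client side
dsDepartments.DataSet := cdsDepartments;
cdsEmployees.MasterSource := dsDepartments;
cdsEmployees.MasterFields := 'DEPT_NO'; // creates a local index on the detail

// both TClientDataSets must be opened explicitly
cdsDepartments.Open;
cdsEmployees.Open;

// ...and later, updates must be applied to both, not just the master
cdsDepartments.ApplyUpdates(0);
cdsEmployees.ApplyUpdates(0);
```

Note that assigning MasterFields on a TClientDataSet automatically creates the index needed to filter the detail records for the current master record.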
Personally, I prefer the use of nested tables. Especially since we can save the nested
datasets in one big file using the SaveToFile method, and use that as our single briefcase.
With the client-side master-detail approach, we need two individual files.
AutoIncrement Fields
Sometimes, you work with database tables that have fields that get an automatically
incremented value, the so-called AutoIncrement fields. In this section, I'll demonstrate
how we can define and use AutoIncrement fields (SQL Server or Blackfish SQL) with Delphi
and DataSnap.
SQL Server
First of all, we need a sample table. Using SQL Server 2008, I’ve used the following SQL to
create a new table with an identity key field:
CREATE TABLE dbo.Demo
(
DemoID int NOT NULL IDENTITY (1, 1),
DemoName nvarchar(50) NULL
) ON [PRIMARY]
GO
ALTER TABLE dbo.Demo ADD CONSTRAINT
PK_Demo PRIMARY KEY CLUSTERED
(
DemoID
) WITH( STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON,
ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
ALTER TABLE dbo.Demo SET (LOCK_ESCALATION = TABLE)
GO
Note that we need a tool like Microsoft SQL Server Management Studio to create this
table, since the Data Explorer does not support creating "identity" fields, which are the
equivalent of auto-increment fields in SQL Server.
BlackfishSQL
For BlackfishSQL, there is unfortunately no example table in the Employee database that
uses an AUTOINCREMENT field, so we have to define one for ourselves. For Blackfish SQL,
the SQL DDL to create a similar table is as follows:
CREATE TABLE DEMO (
DEMOID INTEGER AUTOINCREMENT NOT NULL,
DEMONAME VARCHAR(50)
)
ALTER TABLE DEMO ADD PRIMARY KEY (DEMOID)
We can use the Data Explorer for this table, since the type “INTEGER AUTOINCREMENT” is
selectable from there.
Note that BlackfishSQL no longer ships with Delphi XE or XE2, but you may still have it on
your system as a result of previous installations of Delphi 2007, 2009 or 2010.
InterBase
InterBase does not support autoincrement fields. You can simulate them by using a
generator in combination with a trigger, which is beyond the scope of this DataSnap
courseware manual, and left as an exercise for the reader.
The general principle of how to use autoincrement key values in DataSnap will still apply to
InterBase tables.
DataSnap Server
Once we have the sample table(s) with the AutoIncrement field, it's time to build the
DataSnap Server. Do File | New | Other, and create a DataSnap Server application using
the Wizard from the Object Repository.
Place a TSQLConnection component on the ServerMethods Unit, and connect it to the
database you want to test. Then, place a TSQLDataSet component, connect it to the
TSQLConnection component and specify the following SQL statement for its CommandText
property:
SELECT DEMOID, DEMONAME
FROM DEMO
If you then use the Fields Editor on the TSQLDataSet to add all fields, you'll notice that the
DEMOID field is of type TIntegerField, and not a TAutoIncField. This is no problem, but we
have to change one setting for the DEMOID field in order to make this example work: the
pfInUpdate flag must be removed from the ProviderFlags property of the TIntegerField.
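The same setting can also be applied in code, for example in the server module's OnCreate event handler. A minimal sketch, assuming the persistent field is named sqlDemoDEMOID (the names here are hypothetical):

```delphi
// assumes Data.DB in the uses clause (for TProviderFlags / pfInUpdate)
procedure TServerMethods1.DataModuleCreate(Sender: TObject);
begin
  // exclude the identity field from generated INSERT and UPDATE statements
  sqlDemoDEMOID.ProviderFlags := sqlDemoDEMOID.ProviderFlags - [pfInUpdate];
end;
```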
Now, place a TDataSetProvider on the ServerMethods Unit, and give it a sensible name
(since it will be exported), like dspDemo. Then, connect it to the TSQLDataSet component.
DataSnap Client
Run the DataSnap Server application, so we can connect to it from a DataSnap Client
application. We can use a regular VCL Forms Application for our DataSnap client. Place a
TSQLConnection component on the form, and connect it to the DataSnap Server by setting
the Driver to DataSnap and the appropriate subproperties of the Driver property.
Place a TDSProviderConnection component and a TClientDataSet component and connect
these to the dspDemo (TDataSetProvider) from the DataSnap Server side.
Use the Fields Editor to create and examine the fields DEMOID and DEMONAME. You may
notice – to your surprise – that DEMOID is again marked as required, so we must set its
Required property to False here. Also make sure the pfInUpdate flag is removed from the
ProviderFlags of the DEMOID field, on both the TSQLDataSet and the TClientDataSet.
Note that we also must call the Refresh method in order to retrieve the value that the
DBMS assigns to the DEMOID AutoIncrement field. The call to Refresh may no longer be
needed when the poPropogateChanges flag of the TDataSetProvider Options property is
(finally) implemented, one day. But until that day, we cannot refresh a single record and
must call Refresh on the entire TClientDataSet.
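In code, applying the updates followed by the required refresh could look like this (a sketch, assuming a TClientDataSet named cdsDemo and a button handler on the client form):

```delphi
procedure TClientForm.btnApplyClick(Sender: TObject);
begin
  if cdsDemo.ApplyUpdates(0) = 0 then  // 0 = allow no update errors
    cdsDemo.Refresh; // re-fetch, so DEMOID shows the server-assigned value
end;
```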
Master-Detail
In order to use this technique in combination with master-detail relationships, we cannot
leave the primary key field empty. Instead, we need to assign a unique temporary value,
such as a negative value.
For this, we need to implement the OnNewRecord event of the local TClientDataSet, as
follows:
procedure TForm12.ClientDataSet1NewRecord(DataSet: TDataSet);
{$J+} // allow assignable typed constants
const
  ID: Integer = -1;
begin
  DataSet.FieldByName('DEMOID').AsInteger := ID;
  Dec(ID); // the next new record gets -2, then -3, etc.
end;
Note that this typed constant will start at -1 and get decreased by 1 for every new record
we add (during the "run" of the application).
First of all, if it’s an existing DataSnap Server application that you want to migrate, and
not just the remote data module, you need to unregister the DataSnap server by running
the executable from the command-line with the /unregister command-line option. If you
don’t do that right from the start, you will not be able to unregister the remote data
module from the registry (unless you can restore a backup of the project later).
In the unit for the remote data module, we must remove the code from the initialization
section. If you want to keep your unit compatible between Delphi 2007-or-below, and
2009-or-later, you can place this code inside {$IFDEF}s as follows:
{$IF CompilerVersion < 20}
initialization
TComponentFactory.Create(ComServer, TRemoteDataModule2010,
Class_RemoteDataModule2010, ciMultiInstance, tmApartment);
{$IFEND}
end.
We should also remove the UpdateRegistry routine from the project, or place it in
{$IFDEF}s as well.
{$IF CompilerVersion < 20}
class procedure UpdateRegistry(Register: Boolean;
const ClassID, ProgID: string); override;
{$IFEND}
The most important change – to turn the project into a COM-less DataSnap server –
involves the removal of the type library (or .ridl files) and the type library import unit.
These cannot be left in {$IFDEF}s, so if you need to keep a Delphi 2007 or below (COM-
enabled) and Delphi 2009 or later (COM-less) version of the DataSnap server you need to
make a copy of the project now. We should use a TDSServerClass component in the
DataSnap server application and return the TRemoteDataModule class, just as we’ve done
before.
Finally, we should make sure that all custom methods that were added to the
TRemoteDataModule are moved from the protected section (the default in COM-enabled
DataSnap) to the public section (so method info is generated in the COM-less DataSnap
architecture).
For the DataSnap Server, we must now also deploy the database drivers. Which drivers
and files are needed depends on the database you selected. When using DBX4, make sure
to check the TSQLConnection component as well as the dbxconnections.ini and
dbxdrivers.ini files, which can be found in the C:\Documents and Settings\All
Users\Documents\RAD Studio\dbExpress\9.0 directory on Windows XP, or in the
C:\Users\Public\Documents\RAD Studio\dbExpress\9.0 directory on Windows Vista and
Windows 7.
The dbxdrivers.ini file will specify – for the given Driver – the DriverPackageLoader and
MetaDataPackageLoader (usually pointing to the same package).
Finally, make sure to add the MidasLib unit to the uses clause of both the DataSnap Server
and DataSnap Client applications, so you do not have to deploy the MIDAS.DLL as
separate DLL with the DataSnap applications.
All filters are derived from the abstract TTransportFilter class, which is defined as follows
in unit Data.DBXTransport, and contains three abstract methods that need to be
implemented at the very least: Id, ProcessInput and ProcessOutput:
public
property Parameters: TDBXStringArray read GetParameters;
property UserParameters: TDBXStringArray read GetUserParameters;
property PublicKeyCryptograph: Boolean read IsPublicKeyCryptograph;
property ServerInstance: Boolean read IsServerInstance write SetServerInstance;
property ClientInstance: Boolean read IsClientInstance;
property FilterCollection: TTransportFilterCollection read GetFilterCollection
write SetFilterCollection;
Using the OnConnect event handler of the TDSServer component in the DataSnap Server
application, we can examine the registered filters that are used for the connection, for
example:
procedure TServerContainer1.DSServer1Connect(
DSConnectEventObject: TDSConnectEventObject);
var
i: Integer;
begin
CodeSite.EnterMethod('DSServer1.OnConnect');
CodeSite.Send('ConnectProperties: ' +
DSConnectEventObject.ConnectProperties.Properties.Text);
if Assigned(DSConnectEventObject.ChannelInfo) then
begin
CodeSite.Send('ChannelInfo.Id: ' + IntToStr(DSConnectEventObject.ChannelInfo.Id));
CodeSite.Send('ChannelInfo.Info: ' + DSConnectEventObject.ChannelInfo.Info);
end;
if Assigned(DSConnectEventObject.Transport) then
begin
CodeSite.Send('Transport: ' + DSConnectEventObject.Transport.ToString);
CodeSite.Send('Transport.BufferKBSize: ' +
IntToStr(DSConnectEventObject.Transport.BufferKBSize));
for i:=0 to DSConnectEventObject.Transport.Filters.Count-1 do
CodeSite.Send('Transport.Filter: ' +
DSConnectEventObject.Transport.Filters.GetFilter(i).Id);
end;
CodeSite.ExitMethod('DSServer1.OnConnect');
end;
This will produce a useful list of all filters that are registered for the DataSnap server (as
soon as a client connects to the server).
In this example, all three built-in filters have been added to the TDSTCPServerTransport
component.
ZlibCompression Filter
The ZLibCompression Filter ships with Delphi XE2 already, and can be used to compress
the data stream between the DataSnap Server and DataSnap Client (and vice versa). Both
the TDSTCPServerTransport component (for TCP/IP transport) and the DSHTTPService
component (for HTTP and HTTPS transport) have a Filters property that holds a
TTransportFiltersCollection of TTransportFilterItems.
We can click on the ellipsis for the Filters property to edit the collection of filters.
We can add a new TTransportFilterItem, and then use the Object Inspector to set the
FilterId and some optional properties.
For the ZLibCompression filter, the Properties property gets a default value of
CompressMoreThan=1024, which means that only data packages bigger than 1024 bytes
will get compressed. For smaller packages, there's a good chance the compression would
have no positive effect.
We can double click on the Properties to get a Value List Editor where the value of
CompressMoreThan can be modified:
Note that apart from the Filters property of the TDSTCPServerTransport component at the
server side, we should also specify that we want to use this filter at the client side (to
compress the outgoing requests and decompress the incoming responses). For this, we
only need to add the Data.DbxCompressionFilter unit to the uses clause of the ClientForm.
That will automatically register the TTransportCompressionFilter and make sure it’s used to
communicate with the server.
If you do not add the Data.DbxCompressionFilter unit to the uses clause, then running the
client will raise an exception with the message “Communication filter ZLibCompression is
not registered. Filter class needs to be registered in order to communicate with the
server.”.
Note that if you accidentally added a filter at the server side without specifying a FilterId
value, then the client will get error “Connection closed gracefully”, and cannot connect.
Encryption Filters
The encryption filters work only when combined. This means we have to add both the RSA
and the PC1 filters to the transport component. The RSA filter is used to encrypt the key of
the PC1 filter, while the PC1 filter is used to encrypt the data itself.
The RSA filter is implemented by the TRSAFilter class in unit Data.DBXRSAFilter, and has
the following properties:
The UseGlobalKey property can have the value "true" or "false". The KeyLength
determines the length of the key (generated by a call to the TRSACypher's class method
LoadSSLAndCreateKey, which receives both the KeyLength and KeyExponent values as
argument).
Internally, the TTransportCypherFilter uses a CypherKey (to encrypt the data) as well as a
ConfederateKey (to decrypt the data). It uses the TPC1Cypher sealed class to encrypt the
data one byte at a time using the Cypher method. The SetUp and TearDown methods are
used to create and release an instance of the TPC1Cypher class, as follows:
var
  Cypher: TPC1Cypher;
  I: Integer;
begin
  Cypher := SetUp(FCypherKey);
  for I := 0 to Length(Data) - 1 do
    Data[I] := Cypher.Cypher(Data[I]);
  TearDown(Cypher);
  Result := Data;
end;
Log Filter
As a demonstration of producing custom DataSnap filters, I have implemented a special
Log filter to allow us to log the (amount of) data being sent from the client to the server
and vice versa.
This is possible because DataSnap is open to allow us to define our own transport filters.
We can do this by deriving a new class from the TTransportFilter type. The TTransportFilter
base type is an abstract class derived from TBaseTransportFilter, with three abstract
functions that we must override and implement before we can actually use the filter.
type
TTransportFilter = class abstract(TBaseTransportFilter)
public
function Id: UnicodeString; virtual; abstract;
function ProcessInput(const Data: TBytes): TBytes; virtual; abstract;
function ProcessOutput(const Data: TBytes): TBytes; virtual; abstract;
The Id function must return a unique name for our filter. The ProcessInput method takes a
Data TBytes parameter and returns a TBytes value as well – we can simply echo the input
data if we don’t want to do anything specific with it. Finally, the ProcessOutput method
also takes a Data TBytes parameter and returns a value of the same TBytes type.
Based on these three abstract methods, plus the constructor and destructor, we can write
a very small LogFilter that will report the number of bytes that have been passed from the
client to the server and back.
The definition of my TLogFilter is as follows:
unit LogFilter;
interface
uses
  SysUtils, DBXPlatform, DBXTransport;
type
  TLogFilter = class(TTransportFilter)
  public
    constructor Create; override;
    destructor Destroy; override;
    function Id: UnicodeString; override;
    function ProcessInput(const Data: TBytes): TBytes; override;
    function ProcessOutput(const Data: TBytes): TBytes; override;
  end;
const
  LogFilterName = 'Log';
For the implementation, we can again use CodeSite, and this time I’ve used the optional
message type arguments to display filter messages using special colors in the CodeSite
LiveViewer.
The Constructor and Destructor will have a blue icon, while the input filter messages will
be yellow and the output filter messages will be orange. We can combine this for example
with DataSnap Server messages in red and DataSnap Client messages in green to help us
easily identify which message is coming from which tier.
implementation
uses
CodeSiteLogging;
constructor TLogFilter.Create;
begin
inherited Create;
CodeSite.Send(csmBlue,'TLogFilter.Create');
end;
destructor TLogFilter.Destroy;
begin
CodeSite.Send(csmBlue,'TLogFilter.Destroy');
inherited Destroy;
end;
In the ProcessInput and ProcessOutput methods I do not just copy the Data, but I also
obtain the Length of the Data (in bytes) and I try to convert the Data into a string using
TEncoding.ASCII.GetString. This works for obtaining information like method calls and
parameter values, but not for Unicode data or encrypted data that is sent over the
transport using the filter.
function TLogFilter.ProcessInput(const Data: TBytes): TBytes;
begin
Result := Data; // log incoming data
CodeSite.Send(csmYellow, 'ProcessInput ' + IntToStr(Length(Data)),
TEncoding.ASCII.GetString(Data));
end;
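The ProcessOutput method can be implemented the same way. A sketch, mirroring ProcessInput – here csmOrange is assumed to be the orange CodeSite message type constant matching the colors described earlier:

```delphi
function TLogFilter.ProcessOutput(const Data: TBytes): TBytes;
begin
  Result := Data; // log outgoing data, pass it on unmodified
  CodeSite.Send(csmOrange, 'ProcessOutput ' + IntToStr(Length(Data)),
    TEncoding.ASCII.GetString(Data));
end;
```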
The Id function should return the name of the filter, in this case stored in the
LogFilterName constant.
function TLogFilter.Id: UnicodeString;
begin
Result := LogFilterName;
end;
Finally, in the initialization section we need to register the filter, and in the finalization
section we need to unregister it. This ensures that DataSnap client applications only have
to add the LogFilter unit to the uses clause in order to automatically use the filter.
initialization
TTransportFilterFactory.RegisterFilter(LogFilterName, TLogFilter);
finalization
TTransportFilterFactory.UnregisterFilter(LogFilterName);
end.
In order to be able to pick this Log filter at design-time, we need to install it in the Delphi
XE2 IDE, which is why I’ve created a DataSnapLogFilter package. However, if you don’t
want to install it at design-time, you do not need to. We can also register and use it at
run-time.
LogFilter Client
For the client, we only have to add the LogFilter unit to the uses clause, but for the
DataSnap server (if you didn't add the Log filter at design-time) we need to add the
LogFilter unit to the uses clause of the ServerContainer unit, and also write two or three
lines of code in the OnCreate event of the Server Container unit.
procedure TServerContainer1.DataModuleCreate(Sender: TObject);
begin
CodeSite.Send('TMyServerContainer.Created');
DSTCPServerTransport1.Filters.AddFilter(LogFilterName);
DSHTTPService1.Filters.AddFilter(LogFilterName);
end;
The first call to AddFilter will register the LogFilter (using the LogFilterName constant) for
the TCP/IP transport using the TDSTCPServerTransport component. The same will happen
for the HTTP transport using the second call to AddFilter.
The LogFilter unit is about the smallest DataSnap filter you can write, but it's fully
functional. In fact, we can observe some interesting things when running the DataSnap
Server and Client with the LogFilter registered.
As you can see, the number of bytes being sent back and forth is the same (note: if you
combine the LogFilter with the CompressionFilter, the numbers will no longer match).
Some of the text fields show up encrypted, but only half of them. This is because the
filters are run “in order”. And the order is encryption (RSA + PC1) followed by the LogFilter
for the messages from the server to the client, and LogFilter followed by encryption for the
message from the client to the server. This can be seen in the above screenshot, where
each pair of encrypted strings starts with a message (sent) from the server and is followed
by the same message (received) by the client.
If we remove the RSA and PC1 filters, then the log information will always be unencrypted,
showing names of methods and their parameters. Using the ZLibCompression filter in
combination with the LogFilter will again show garbage for message pairs from one side
(depending on the order of the LogFilter and the ZLibCompression).
In a similar way, we can see what happens if we open the TClientDataSet with the nested
table approach – especially in the stateless approach, where the OnBeforeGetRecords
event handler is also called before the TMyServerMethods.AS_GetRecords method,
possibly using CodeSite to log the OwnerData value as well (see previous section).
Note that the buffer size at the DataSnap server side should be matched by the buffer size
at the DataSnap client side. For the DataSnap client, we can specify the buffer size in the
TSQLConnection component. After selecting DataSnap as Driver name, the BufferKBSize is
available as subproperty (with a default value of 32 as well). Make sure that this value
matches the BufferKBSize value at the server side for best results.
When performance is an issue, you should consider not using DataSnap Filters for HTTP
connections (which include ISAPI DLLs), but only for fast TCP/IP connections.
REST Calls
Using DataSnap XE2, we can make REST calls using a special syntax. If the URL to the
DataSnap Server is https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll,
then we can add /datasnap/rest to this URL, followed by the name of the Server Method
class, the method, and the arguments.
For example, to pass the string argument "Test" to the predefined ReverseString server
method of the TMyServerMethods class that we implemented earlier, we need to add
/datasnap/rest/TMyServerMethods/ReverseString/Test to the above URL, resulting in
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/datasnap/rest/TMy
ServerMethods/ReverseString/Test
Unfortunately, the DataSnap REST ISAPI Server deployed on my web server is password
protected. So when we call the above URL, we’ll see an error
Once logged in, the browser will remember our credentials (for the remainder of the
browser session), and we can call DataSnap server methods as REST functions in the
browser, like the Reverse String function:
However, in order to test the DataSnap REST capabilities passing the direct URL like
/datasnap/rest/TMyServerMethods/ReverseString/Bob, we need to remove the login
checks.
For the ServerTime method, there is no authentication check, so we can safely call the URL
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/datasnap/rest/TMy
ServerMethods/ServerTime with the following results:
The same can be done for server method EchoString, which can be called directly using
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/datasnap/rest/TMy
ServerMethods/EchoString/Test
The generic syntax of making a REST request to a REST-enabled DataSnap Server written
in Delphi XE2 is as follows:
http[s]://server/datasnap/rest/<class>/<method>/<parameters>
Calling this REST-enabled URL results in a JSON result, as we saw in the previous page.
The result is a JSON construct:
{"result":["2010-11-15 20:26:38.107"]}
We can also call the /EchoString method. This one expects arguments, which we can pass
after the method name itself (in the right order: in this case only one). So the result of the
call to
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/
datasnap/rest/TMyServerMethods/EchoString/This%20is%20a%20test
(with spaces encoded by the browser to %20), will be another JSON value with the
following contents:
{"result":["This is a test"]}
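From a Delphi client, such a REST call can also be made with a plain HTTP GET. A minimal sketch using Indy's TIdHTTP component – the localhost URL and port are assumptions for a locally running stand-alone server:

```delphi
uses
  IdHTTP;

function GetEchoString(const Value: string): string;
var
  Http: TIdHTTP;
begin
  Http := TIdHTTP.Create(nil);
  try
    // returns the raw JSON response, e.g. {"result":["..."]}
    Result := Http.Get('http://localhost:8080/datasnap/rest/' +
      'TMyServerMethods/EchoString/' + Value);
  finally
    Http.Free;
  end;
end;
```

The string result can then be parsed with the TJSONObject class from the DBXJSON unit, just like the examples shown later in this chapter.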
We can also try to call the GetEmployees method, although it does not return a value that
can be seen in the browser (but rather an HTTP 500 Internal Server Error, or worse if the
browser is allowed to display the actual error).
Calling any of the IAppServer methods will also fail, for example when calling the
AS_GetRecords method passing the dspEmployees as argument, as specified in the URL
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/datasnap/rest/TMy
ServerMethods/AS_GetRecords/dspEmployees we will get an error message
JSON Serialization
Now that we're on the topic of JSON, it's a good time to realize that the DataSnap Server
Methods in Delphi 2010 could not return just any type: only the types that are supported
by both the client and the server, and that are streamable as JSON objects. User-defined
objects, such as custom classes, had to be explicitly streamed to JSON objects before we
could use them.
This was changed in Delphi XE, where we can pass almost any type, including TObjects,
but not records or pointers.
Before we can implement the new EchoStringEx method, we should first see how the
callback method can be defined at the client side (after all, it is a client method which can
be called by the server).
At the client side, we must declare a new class, derived from TDBXCallback, and override
the Execute method.
type
TCallbackClient = class(TDBXCallback)
public
function Execute(const Arg: TJSONValue): TJSONValue; override;
end;
Inside the Execute method, we get the Arg argument of type TJSONValue, which we can
clone and then get our hands on the actual contents. Using CodeSite.Send, I'm sending
the value to the CodeSite viewer (at the server).
The Execute method can also return a TJSONValue itself, so I just return the same value
again.
function TCallbackClient.Execute(const Arg: TJSONValue): TJSONValue;
var
Data: TJSONValue;
begin
Data := TJSONValue(Arg.Clone);
CodeSite.Send('Callback: ' + TJSONObject(Data).Get(0).JSonValue.value);
Result := Data
end;
For this example, the callback method will show the value that was passed to the
EchoStringEx method, before the method actually returns (i.e. while the method is still
being executed).
The implementation of the EchoStringEx method at the server side should now put the
string value inside a TJSONObject and pass it to the callback.Execute method, as follows:
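A minimal sketch of such an implementation – the JSON pair name 'value' is an assumption, and note that Execute may take ownership of the TJSONObject passed to it:

```delphi
function TMyServerMethods.EchoStringEx(Value: string;
  Callback: TDBXCallback): string;
var
  Msg: TJSONObject;
begin
  // wrap the string in a JSON object and invoke the client-side callback
  Msg := TJSONObject.Create(TJSONPair.Create('value', Value));
  Callback.Execute(Msg); // runs TCallbackClient.Execute at the client side
  Result := Value; // returned only after the callback has completed
end;
```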
Note that the callback function is executed (at the client side) – and will return – before
the actual EchoStringEx method finishes at the server side.
Finally, the call to the EchoStringEx method at the client side also needs to change, since
we now need to pass a callback class – an instance of our new TCallbackClient - as second
argument.
var
MyCallback: TCallbackClient;
begin
MyCallback := TCallbackClient.Create;
try
ShowMessage(
Server.EchoStringEx(Edit1.text, MyCallback));
finally
// MyCallback.Free;
end;
end;
Note that we should NOT call Free in the finally-end clause, since the TDBXCallback class
uses reference counting to know when it's time to release the instance. If you call
MyCallback.Free anyway, you will get access violations when you try to create another
instance of the TCallbackClient.
In fact, we can write it shorter, creating the TCallBackClient on the fly, as follows:
begin
ShowMessage(
FServer.EchoStringEx(Edit1.Text, TCallBackClient.Create));
end;
You can use the ReportMemoryLeaksOnShutdown global variable to verify that this does
not produce a memory leak.
This simple example demonstrates how to use "thin" client-side callback methods in
DataSnap.
The limitation should be clear: callbacks could only be placed while a server method was
being executed, which isn't really useful in most cases (except for scenarios where the call
to the login method lasts the entire session, so callbacks can be sent during the session).
unit ChannelID;
interface
const
  ChannelName = 'MyCallbackChannelName';
implementation
end.
For the callback client, we need a regular VCL Forms Application, so create a new VCL
Forms Application, and save it as CallbackClient, with the form in ClientForm, adding the
ChannelID unit to the project and the uses clause of the ClientForm unit.
The ChannelName is the name for the channel, but we also need a unique CallbackID for
each client listening to a callback channel. This can be a username, but only if it’s 100%
certain that a user will never use more than one DataSnap Client at a given time. A more
robust way to produce a unique name is to use a GUID. This can be done in the
FormCreate event handler, where we create a new GUID and assign it to the Caption of a
TLabel component so we can see it and use it when needed.
procedure TFormClient.FormCreate(Sender: TObject);
begin
  // CreateClassID (from unit ComObj) returns a freshly created GUID as string
  CallbackID.Caption := CreateClassID;
end;
Now, we need two buttons: one to register a callback, and one to unregister the callback
again. We also need a third button in order to send messages over the callback channel
(which is bi-directional, not just for the server to send callback messages to the client, but
also for the client to send callback messages to other clients!). And finally, we need a
TMemo component to display the received callback messages from the server and other
clients.
In order for the client to connect to the server, using REST for this first example, we need
a TDSRestConnection component on the form.
We'll configure it to connect to the DataSnap Server shortly, but first we need to add a
special field to the protected section of the client form: a ChannelManager of type
TDSRestClientChannel.
And since callbacks operate in different threads, we also need a procedure QueueLogValue
to handle incoming values (from callback methods) that will be used to update the GUI. In
short, add a private procedure and a protected field to the client form as follows:
private
{ Private declarations }
procedure QueueLogValue(const AValue: string);
protected
{ Protected declarations }
ChannelManager: TDSRestClientChannel;
Of course, here we can do anything else with the data that arrived with the callback
function, but for now we just put the incoming messages in the TMemo.
ChannelManager.Connect(callback);
if ChannelManager.Connected then
begin
ChannelManager.RegisterCallback(callback);
btnRegisterCallback.Enabled := False
end
end;
Note the anonymous method being used for the callback function: right now, a very simple
one that takes a value and a data type, and we’re only using the value to add to the
TMemo control (done in a thread-safe way inside the QueueLogValue).
We may want to add some error handling in case the ChannelManager could not be
connected (so the callback isn’t really registered). In this case, we only disable the
Register button if the call to RegisterCallback succeeded, so we’re not tempted to try to
register the callback channel twice.
There is also a procedure called ChannelBroadCast, which is implemented as follows, and
can be called at any time to broadcast the ChannelID over the Channel itself:
procedure ChannelBroadCast(AChannel: TDSRestClientChannel);
var
LMessage: TJSONString;
begin
if AChannel <> nil then
begin
LMessage := TJSONString.Create(Format('%s %s',
[AChannel.ServerChannelName, AChannel.ChannelId]));
try
AChannel.Broadcast(LMessage);
finally
LMessage.Free;
end;
end;
end;
This is a global procedure, and can be added to a different unit, for example the ChannelID
unit (since it's not part of this specific DataSnap Client, but more generic in nature).
The btnAlarm click will make use of this generic ChannelBroadCast procedure, sending an
alarm to other clients:
procedure TFormClient.btnALARMClick(Sender: TObject);
begin
ChannelBroadCast(ChannelManager);
end;
We should also be able to unregister the callback, which can be done as follows, by
clearing the ChannelManager:
procedure TFormClient.btnUnregisterCallbackClick(Sender: TObject);
begin
FreeAndNil(ChannelManager);
btnRegisterCallback.Enabled := True
end;
Note that we can enable the btnRegisterCallback button again after the ChannelManager has
been freed (in case we need the callback later).
Freeing the ChannelManager should also be done when we close the form, to ensure the
server knows that the client is gone (and the callback channel closed).
procedure TFormClient.FormClose(Sender: TObject; var Action: TCloseAction);
begin
FreeAndNil(ChannelManager)
end;
Save the application in project CallbackServer, and the form in unit MainForm. Make sure
to add the ChannelID unit to the project and the uses clause of the MainForm as well.
We should first turn the MainForm into a DataSnap Server application. For this, we only
need two components: a TDSServer and a TDSHTTPService. Note that although we’ve
used a TDSServerClass component in combination with some kinds of Server Methods unit
in all previous examples so far, this is not really required. The only component that we
always need is the TDSServer component, plus some kind of transport component (in this
case the TDSHTTPService).
Set the AutoStart property of the TDSServer component to True, and assign the
TDSServer component to the DSServer property of the TDSHTTPService component. Next,
make sure to assign a unique value to the HttpPort property of the TDSHTTPService
component (the default HttpPort 80 may not be available on your machine, since IIS and
Apache already listen to that port).
Now, we need a number of buttons, a TEdit and a TListbox control to work with the
callback channels from the connected clients. The TEdit will contain the message we want
to send to one or all clients, the TListBox will contain the registered callback channels (i.e.
the clients that we can call), and the buttons will be used to obtain the list of registered
callback channels, to send a message to all clients or a specific client, and to disconnect all
callbacks (for example in case the server wants to close).
In order to get the list of callback channels that have been registered with the TDSServer
(that the client connects to), we can call the GetAllChannelCallbackId function, which will
return a TList of strings. We can also call the GetAllChannelClientId function, which will
return the list of client IDs that can be used to get our hands on an individual channel
(one that was registered by one client – these IDs need to be unique, remember?).
To get the list of Callback channels, and place them in a TListBox, we can do the following:
procedure TServerForm.btnCallbacksClick(Sender: TObject);
var
clients: TList<String>;
client: String;
begin
ListBox1.Clear;
clients := DSServer1.GetAllChannelCallbackId(ChannelName);
try
for client in clients do
ListBox1.Items.Add(client)
finally
clients.Free
end
end;
This will return a list of GUIDs, one (unique) for each DataSnap client that registered itself
for the callback functions.
In order to actually call the callback function, using all defined channels, we must call the
GetCallbackTunnel method, passing the ChannelName as parameter. Using the resulting
TDSCallbackTunnel, we can call BroadcastMessage, as follows:
procedure TServerForm.btnCallbackAllClick(Sender: TObject);
var
CallbackTunnel: TDSCallbackTunnel;
JSONMsg: TJSONObject;
begin
//wrap the message in a JSON object
JSONMsg := TJSONObject.Create;
CallbackTunnel := DSServer1.GetCallbackTunnel(ChannelName);
if Assigned(CallbackTunnel) then
begin
JSONMsg.AddPair(TJSONPair.Create(ChannelName+'Tunnel',
edCallbackInfo.Text + ' at: ' + DateTimeToStr(Now)));
CallbackTunnel.BroadcastMessage(JSONMsg)
end
else // use ChannelName
begin
JSONMsg.AddPair(TJSONPair.Create(ChannelName,
edCallbackInfo.Text + ' at: ' + DateTimeToStr(Now)));
DSServer1.BroadcastMessage(ChannelName, JSONMsg)
end
end;
To do this for one specific client – for example the selected client in the TListBox – we need
to change two lines of code, passing the client (GUID) to the BroadcastMessage calls, as
follows:
procedure TServerForm.btnCallClientClick(Sender: TObject);
var
CallbackTunnel: TDSCallbackTunnel;
JSONMsg: TJSONObject;
client: String;
begin
client := ListBox1.Items[ListBox1.ItemIndex];
// wrap the message in a JSON object
JSONMsg := TJSONObject.Create;
CallbackTunnel := DSServer1.GetCallbackTunnel(ChannelName);
if Assigned(CallbackTunnel) then
begin
JSONMsg.AddPair(TJSONPair.Create(ChannelName+'Tunnel',
edCallbackInfo.Text + ' at: ' + DateTimeToStr(Now)));
CallbackTunnel.BroadcastMessage(client, JSONMsg) // client = GUID
end
else // use ChannelName
begin
JSONMsg.AddPair(TJSONPair.Create(ChannelName,
edCallbackInfo.Text + ' at: ' + DateTimeToStr(Now)));
DSServer1.BroadcastMessage(ChannelName, client, JSONMsg) // client = GUID
end
end;
If we want to unregister the callback functions (for example before the server wants to
close), we need to call the UnregisterChannelCallback function, as follows:
procedure TServerForm.btnDisconnectCallbacksClick(Sender: TObject);
var
CallbackTunnel: TDSCallbackTunnel;
begin
CallbackTunnel := DSServer1.GetCallbackTunnel(ChannelName);
DSServer1.UnregisterChannelCallback(CallbackTunnel)
end;
Finally, if we do this at the FormClose, we need to add a little waiting time in order to give
the channels the chance to close gracefully:
procedure TServerForm.FormClose(Sender: TObject; var Action: TCloseAction);
begin
btnDisconnectCallbacksClick(Sender);
Sleep(1000); // give callback time to close...
DSServer1.Stop
end;
Now compile and run the CallbackServer, and then connect the TDSRestConnection
component on the CallbackClient and compile and run the CallbackClient.
We first need to run the callback server, followed by a number of callback clients. On the
callback server, we can click on the “Show Callback Clients” button, but nothing will show
up until at least one of the callback clients has registered a callback.
As soon as a callback client registers the callback, the memo shows the fact that the
callback channel was registered, and the button is disabled:
We can do this in another callback client as well, with the same effect. Then, in the
callback server application, we can click on the “Show Callback Clients” button to display
two individual (and unique) callback clients.
If we enter some message in the CallbackInfo TEdit and click on the “All Clients callback”
button, then all callback clients will get a callback message in their TMemo control.
Note that we can further dissect the TJSONValue here instead of showing it in the TMemo.
We can also click on the “Client Callback” button on the callback server form, but only if
we select one of the callback clients in the listbox. Otherwise, you’ll get an error “List index
out of bounds (-1)”, which I leave to the reader as an exercise to avoid.
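One possible way to avoid that error, for example, is a guard at the top of the event handler (a minimal sketch of one solution to that exercise):

```pascal
// at the start of btnCallClientClick: require a selection first
if ListBox1.ItemIndex < 0 then
begin
  ShowMessage('Please select a callback client in the list first');
  Exit
end;
```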
Finally, a callback client can place a callback on the channel as well, using the broadcast
message. This was implemented in the Alarm button’s OnClick event handler. Note that the
client itself will also receive this message:
This technique can be used to signal important changes in a database or other system
where clients need to be notified of a change.
With this framework in mind, we can extend the actual callback (the anonymous method)
with more powerful examples. As an example, we can pass the name of a table that just
received updates, so all clients will know that they may have to refresh the contents of a
certain TClientDataSet in order to allow users to see the latest data.
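As a sketch of that idea (the helper name, table name and dataset name are assumptions, not part of the example projects), the server could broadcast the name of a changed table over the same channel, and each client could react in its callback:

```pascal
// server side (sketch): tell all registered clients that a table has changed
procedure NotifyTableChanged(AServer: TDSServer; const ATableName: string);
var
  Msg: TJSONObject;
begin
  Msg := TJSONObject.Create;
  Msg.AddPair('TableChanged', ATableName);
  AServer.BroadcastMessage(ChannelName, Msg) // server takes ownership of Msg
end;

// client side (sketch): inside the callback, a client could then check for
// a 'TableChanged' pair with value 'EMPLOYEE' and call ClientDataSet1.Refresh
```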
DBX Callbacks
In order for callbacks to work over a DBX connection, we have to make a few changes to
the callback client. It may be worthwhile to make a copy of the client project, so
we can mix DBX and REST clients at the same time. I’ve created a copy of the
CallbackClient project in CallbackClientDBX, with the main form in ClientFormDBX.pas.
On the DBX client form, we need to replace the TDSRestConnection component by a
TSQLConnection component, again connecting it to the DataSnap channel server.
Remove the ChannelManager field from the form, and also clear the implementation of the
button OnClick event handlers, as well as the FormClose event handler. The only thing we
can re-use is the FormCreate, where we obtain a unique CallbackID.
Using DBX, we need to configure the TDSClientCallbackChannelManager component first.
This component has CommunicationProtocol and DSPort properties that we need to set (in
our case to http and the port number of the DataSnap callback server, respectively). We
can also set the DSHostname property, or leave it empty to connect to localhost only.
Then, in the FormCreate, we should assign a value to the ManagerID (which is the unique
CallbackID, or the GUID) and the ChannelName, which we stored in the ChannelID unit.
This can be implemented as follows:
procedure TFormClient.FormCreate(Sender: TObject);
begin
DSClientCallbackChannelManager1.ChannelName := ChannelID.ChannelName;
DSClientCallbackChannelManager1.ManagerId := CreateClassID;
CallbackID.Caption := DSClientCallbackChannelManager1.ManagerId
end;
Registering a callback using DBX means registering a class derived from the abstract base
class TDBXCallback. We need to derive from TDBXCallback and override the Execute
method. The definition of our TCallbackClient can be as follows, storing the ChannelName
and CallbackName (the unique name), as well as overriding the constructor and the
abstract Execute method:
type
TCallbackClient = class(TDBXCallback)
private
FChannelName: string;
FCallbackName: string;
public
constructor Create(const AChannelName, ACallbackName: string);
function Execute(const Arg: TJSONValue): TJSONValue; override;
end;
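The implementation of these methods can then be sketched along these lines: the constructor stores the names, and Execute handles the incoming JSON value. Note that DBX callbacks arrive on a background thread, so updating the GUI must go through TThread.Synchronize; the Memo1 control name is an assumption here:

```pascal
constructor TCallbackClient.Create(const AChannelName, ACallbackName: string);
begin
  inherited Create;
  FChannelName := AChannelName;
  FCallbackName := ACallbackName;
end;

function TCallbackClient.Execute(const Arg: TJSONValue): TJSONValue;
var
  Msg: string;
begin
  Msg := Arg.ToString; // the broadcast JSON message
  TThread.Synchronize(nil,
    procedure
    begin
      FormClient.Memo1.Lines.Add(Msg) // Memo1 is an assumption
    end);
  Result := TJSONTrue.Create // acknowledge the callback
end;
```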
And with this type at hand, we can now register a DBX callback using the
TDSClientCallbackChannelManager, as follows:
procedure TFormClient.btnRegisterCallbackClick(Sender: TObject);
var
Callback: TCallbackClient;
begin
Callback := TCallbackClient.Create(ChannelName, CallbackID.Caption);
DSClientCallbackChannelManager1.RegisterCallback(CallbackID.Caption, Callback);
end;
The complete code in the ChannelID unit, showing both overloaded ChannelBroadCast
methods, is now as follows:
unit ChannelID;

interface

uses
  SysUtils, DBXJSON, DSClientRest, DSCommon, DSHTTPCommon;

const
  ChannelName = 'MyCallbackChannelName';

procedure ChannelBroadCast(AChannel: TDSRestClientChannel); overload;
procedure ChannelBroadCast(AChannelManager: TDSClientCallbackChannelManager); overload;

implementation

procedure ChannelBroadCast(AChannel: TDSRestClientChannel); // REST
var
  LMessage: TJSONString;
begin
  if AChannel <> nil then
  begin
    LMessage := TJSONString.Create(Format('%s %s',
      [AChannel.ServerChannelName, AChannel.ChannelId]));
    try
      AChannel.Broadcast(LMessage);
    finally
      LMessage.Free;
    end;
  end;
end;

// the original listing was truncated here; the DBX overload below is
// reconstructed following the same pattern as the REST version
procedure ChannelBroadCast(AChannelManager: TDSClientCallbackChannelManager); // DBX
var
  LMessage: TJSONString;
begin
  if AChannelManager <> nil then
  begin
    LMessage := TJSONString.Create(Format('%s %s',
      [AChannelManager.ChannelName, AChannelManager.ManagerId]));
    try
      AChannelManager.BroadcastToChannel(AChannelManager.ChannelName, LMessage);
    finally
      LMessage.Free;
    end;
  end;
end;

end.
This completes the implementation of the DBX callback client. Note that we can now run
the same DataSnap callback server against both the REST callback clients and the DBX
callback clients.
The callbacks from the DataSnap server arrive in both the REST and DBX clients, and an
“ALARM” by one of the clients will arrive at all clients, both REST and DBX. So while the
implementation of the callback clients is different depending on the use of REST or DBX,
the behavior is the same.
There are a few differences: the DBX callbacks can be made over HTTP, HTTPS as well as
TCP/IP, while the REST callbacks can only be made over HTTP or HTTPS.
Also remember that the DBX transport can use filters to encrypt or compress the
messages, which may be required depending on the circumstances (although REST can
use HTTPS for a secure connection as well).
We’ve also seen how DataSnap supports callbacks: both the light-weight callbacks that
were introduced in Delphi 2010 and the “heavy” callbacks in DataSnap XE. Not just for
REST clients, but also for DBX clients, and in both cases with identical code for the
DataSnap callback server.
Data Access
There are a number of data access technologies available for FireMonkey applications (in
short: all, except for the BDE). The main difference is the fact that FireMonkey does not
have the concept of a data-aware control, so the data must be “bound” in some other way,
using LiveBindings.
The Data Access itself can remain unchanged (although you may have to change some
connection properties based on the target platform). The Data Access (3) category
contains the TDataSource, TClientDataSet and TDataSetProvider controls. Although there
are no data-aware controls in FireMonkey, the TDataSource can still play a role when
defining master details relationships between datasets, for example.
dbExpress
The dbExpress category (8) contains the TSQLConnection, TSQLDataSet, TSQLQuery,
TSQLStoredProc, TSQLTable, TSqlServerMethod, TSQLMonitor and the TSimpleDataSet
controls (I wouldn’t recommend using the latter; instead, always use the TSQLDataSet –
TDataSetProvider – TClientDataSet chain).
dbGo
The dbGo category (7) contains the TADOConnection, TADOCommand, TADODataSet,
TADOTable, TADOQuery, TADOStoredProc and TRDSConnection controls. Note that there
are ODBC drivers available for Windows and Mac OS X (and also for Linux, for future
targets of the Delphi compiler).
InterBase
The InterBase category (16) contains the TIBTable, TIBQuery, TIBStoredProc,
TIBDatabase, TIBTransaction, TIBUpdateSQL, TIBDataSet, TIBSQL, TIBDatabaseInfo,
TIBSQLMonitor, TIBEvents, TIBExtract, TIBConnectionBroker, TIBScript, TIBSQLParser,
and TIBDatabaseINI controls.
The InterBase Admin category (8) contains the TIBConfigService, TIBBackupService,
TIBRestoreService, TIBValidationService, TIBStatisticalService, TIBLogService,
TIBSecurityService, and TIBServerProperties controls.
DataSnap Client
The DataSnap Client category contains the TDSProviderConnection control, as well as the
TConnectionBroker (but the latter should only be used with the old DataSnap technology):
For a LiveBindings example, and especially for the one that shows the biolife data in a
TStringGrid, we need to start with a TClientDataSet and a TDataSource.
Data-binding
The biolife example data can be found in the C:\Users\Public\Documents\RAD
Studio\9.0\Samples\Data directory as biolife.xml. However, after assigning the FileName
property of the TClientDataSet, we should open the TClientDataSet (by setting the Active
property to True), and then clear the FileName property again. Obviously, the Windows
fully qualified path will be invalid when run on the Mac. However, with the TClientDataSet
open, the data will be stored in the form file (which is .fmx for FireMonkey by the way,
where VCL uses .dfm). Make sure to connect the TDataSource to the TClientDataSet, in
the usual way, and then place a TStringGrid on the FireMonkey form. The display will be a
bit different than usual, with the black border, and the TStringGrid itself will also look a bit
different at design-time.
The TStringGrid has an Align property that works in FireMonkey as well. We get a few more
choices, however, as discussed before. For this demo, I’ve selected alFitLeft.
Link to DB DataSource
Next, we can open up the LiveBindings property of the TStringGrid, and notice that apart
from a choice “New LiveBinding…”, there is now also a choice “Link to DB DataSource…”. If
we pick the latter, we get a dialog where we can select the datasource (DataSource1). We
can even “open up” the DataSource, to see which fields are exposed and made available
by selecting that particular DataSource.
If you click on OK now, then two new components will be created and placed on the FMX
form: a TBindScopeDB and a TBindingsList. Also, the TStringGrid is filled with the “live”
data from the TClientDataSet:
Where did the magic come from? If you double-click on the BindingsList, you’ll see a DB
links category, and the DBLinkStringGrid11 that links the grid control StringGrid1 to the
source BindScopeDB1 (which was also autocreated):
If you double-click on the DBLinkStringGrid11, you can see all the individual Columns and
Expressions that were generated automatically.
Although also available for VCL, LiveBindings are especially important to FireMonkey. If we
add a new LiveBinding in a FireMonkey application, the list of available choices includes a
FireMonkey-only category DB Links, with TBindDBEditLink, TBindDBTextLink,
TBindDBListLink, TBindDBImageLink, TBindDBMemoLink, TBindDBCheckLink, and
TBindDBGridLink (the latter was used in our recent example).
Obviously, the DB Links are less important for VCL applications since we already have
data-aware controls in the VCL.
The next screenshot shows the Biolife table in a FireMonkey form running on Mac OS X.
Note that I’ve added a few more controls: a TBindNavigator with the BindScope set to
BindScopeDB1 and the Align property set to alTop. This Align setting will overlap with the
TStringGrid when its Align is set to alFitLeft (one of the effects of alFitLeft), so we have to
change the Align of the TStringGrid to alLeft to correct that.
Also, to show the image, I’ve added a TImageControl, with Align set to alClient, and used
the LiveBinding property to create a New DB Link to the Graphic field.
Libmidas.dylib Deployment
BTW, running the application on Mac OS X would result in an error regarding a missing
libmidas.dylib image, which means we have to add the libmidas.dylib from the C:\Program
Files\Embarcadero\RAD Studio\9.0\binosx32 directory to the list of Deployment files:
DataSnap Client
For a dbExpress, dbGo for ADO or InterBase example that runs on Mac OS X, you may
need a database that is deployed on the Mac. This is not a problem when building a
DataSnap Client, since that will be a thin (also called “smart”) client, without the need for
database drivers or connectivity. The DataSnap client will connect to the DataSnap server,
and use the DataSnap mechanism to transport the data – even if the client is running on a
Mac OS X machine.
We'll use the existing DataSnapServerRESTISAPI.dll, deployed on my web server (see
previous sections), and access it using a FireMonkey DataSnap Client.
Do File | New - Other, and select a FireMonkey HD Application from the Object Repository
to start our FireMonkey DataSnap Client:
Once you have an open connection, you can right-click on the SQLConnection component
and select the “Generate Client Classes” option:
This will produce a unit, initially called Unit1, with a proxy generated for the DataSnap
server. I always save it in DBXClientClasses.pas.
The proxy unit contains the definition for class TServerMethods2Client, with our five
server methods: EchoString, ReverseString, ServerTime, Add33 and GetEmployees.
type
TServerMethods2Client = class(TDSAdminClient)
private
FEchoStringCommand: TDBXCommand;
FReverseStringCommand: TDBXCommand;
FEmployeesCommand: TDBXCommand;
public
constructor Create(ADBXConnection: TDBXConnection); overload;
constructor Create(ADBXConnection: TDBXConnection; AInstanceOwner: Boolean); overload;
destructor Destroy; override;
function EchoString(Value: string): string;
function ReverseString(Value: string): string;
function ServerTime: TDateTime;
function Add33(X1: Integer; ... ... ... X33: Integer): Integer;
function GetEmployees: TDataSet;
end;
Note the two constructors: the first one is equivalent to calling the second one with “False”
passed as value for the AInstanceOwner parameter (so by default, the DataSnap server
does not own the instance memory).
The function GetEmployees returns a read-only TDataSet. However, it is not only read-
only, but also unidirectional, so we need some additional components to “cache”
the contents.
SqlServerMethod
Place a TSqlServerMethod next to the TSQLConnection component. Connect the
SQLConnection property of the TSqlServerMethod to the TSQLConnection component.
Then, open up the drop-down list for the ServerMethodName property, and select the
TMyServerMethods.GetEmployees server method.
Now, the TSqlServerMethod will be the unidirectional, read-only dataset, and we need two
other components to actually show the contents. Place a TDataSetProvider and a
TClientDataSet on the form. Set the DataSet property of the TDataSetProvider to the
TSqlServerMethod, and the ProviderName property of the TClientDataSet to the name of
the TDataSetProvider. The TClientDataSet will hold the cached records from the Employee
table, and can be used to display them in a TStringGrid for example.
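For reference, the whole chain can also be expressed in code, equivalent to the design-time wiring described above (a sketch; the component names are assumed defaults):

```pascal
// runtime equivalent of the design-time setup (assumed default names)
SqlServerMethod1.SQLConnection := SQLConnection1;
SqlServerMethod1.ServerMethodName := 'TMyServerMethods.GetEmployees';
DataSetProvider1.DataSet := SqlServerMethod1;
ClientDataSet1.ProviderName := 'DataSetProvider1';
ClientDataSet1.Open; // fetches and caches the Employee records
```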
DataSnap LiveBindings
Next, we should place a TDataSource component, with the DataSet property pointed to the
TClientDataSet, and finally a TStringGrid control, with the Align property set to alClient:
Using the LiveBindings property of the TStringGrid, we can select the “Link to DB
DataSource” option, which gives a dialog where we can select the DataSource component:
As soon as we select DataSource1 and click on OK, two additional components will be
placed on the form: a TBindingsList and a TBindScopeDB component. The former maintains
the list of LiveBindings; the latter is specifically for the DB LiveBinding.
Then, if we activate the TClientDataSet, the TStringGrid will show the data from the
Employee table, and the FireMonkey application is now a DataSnap client:
Apart from adding the MidasLib and DBXCompressionFilter units to the uses clause of the
DataSnap Client project, we must also add the DataSnap.DSHTTPLayer in order to register
the http/https support.
You can compile and run the FireMonkey client as a Win32 or Win64 executable:
Mac OS X Target
To produce a Mac OS X client, we need to add a new target platform for OS X in the
project manager. This also requires a Remote Profile to a Mac running OS X, where the
Platform Assistant Server is installed (and running in case you want to actually deploy or
run the resulting FireMonkey DataSnap client application on the Mac).
In order to deploy and run the DataSnap Client on Mac OS X, we need to start the
Deployment page using Project | Deployment, and then click on the toolbar button to add
Feature Files. Add the MIDAS Library (at least for OSX32):
We need to remove the MidasLib unit from the uses clause to deploy to the Mac (or place
it in IFDEFs for Win32 and Win64).
Now you can compile and run it as an OS X target, with the following result:
Summary
In this section, I’ve covered LiveBindings for FireMonkey, and specifically for Mac OS X
applications (as well as data binding to datasets).
This will ensure that a subdirectory called "proxy" is generated in your project directory,
and whenever you request the proxy as a developer, by appending
/proxy/freepascal_ios42.zip or /proxy/freepascal_ios50.zip to the URL, you will get the
two proxy .zip files with the Delphi / FreePascal units that we need for the FireMonkey client.
https://www.bobswart.nl/DataSnapXE2/DataSnapServerRESTISAPI.dll/proxy/freepascal_io
s50.zip is the URL for the iOS 5.0 version of the proxy code. The units in this
freepascal_ios50.zip can be added to a new FireMonkey for iOS project to produce a
FireMonkey for iOS DataSnap Client, as we'll do in the remainder of this section.
Note that if you've created your own DataSnap REST server, you may need to make two
changes to the generated source code in the proxy archive in order to allow Xcode to
compile these units. The problem is caused by the fact that the unit names and file names
are not always the same, and while Windows filenames are not case sensitive, the Mac is.
In order to avoid problems, you need to rename the unit with filename
DSRESTParameter.pas to file name DSRestParameter.pas, and the unit with filename
DSRESTTypes.pas to DSRestTypes.pas. That will ensure that the compiler will try to locate
the correct source files and actually find them.
Tip: if you want to ensure that the correct filenames are always used for new DataSnap
REST Server applications as well, you may want to go to the Object Repository location
which is C:\Program Files (x86)\Embarcadero\RAD Studio\9.0\ObjRepos and in there find
the \en\dsrest\connectors\freepascal_ios50 and \en\dsrest\connectors\freepascal_ios42
subdirectories for the iOS 5.0 and 4.2 proxy files that will be used. Rename the
DSRESTParameter.pas and DSRESTTypes.pas files (change REST to Rest in the filename)
and all Delphi XE2 DataSnap REST Servers with the Mobile Connectors enabled will use the
correct filenames from now on.
DataSnap Client
Start Delphi XE2, do File | New - Other to see the different FireMonkey for iOS targets in
the Object Repository. We can create a FireMonkey 3D iOS Application, or a FireMonkey
HD iOS Application. The former can be used for 3D graphic manipulation applications, but
right now, we want a "normal" business application, so no 3D is needed.
Create a new FireMonkey HD iOS Application, and save the project as DataSnapClient,
with the form in ClientForm.pas.
At this point, we need to unzip the freepascal_ios50.zip file and put the units from this
archive in the same directory that holds the DataSnapClient project files.
Note that the list in the above screenshot still contained the units with the original
filenames DSRESTParameter.pas and DSRESTTypes.pas (it is recommended to rename
them to DSRestParameter.pas and DSRestTypes.pas).
With these units added to the project, we can add the units DSRESTConnection (for the
TDSRestConnection class), DSProxy (for our TMyServerModule class) and DB to the uses
clause.
Now, add a listbox to display the records from the Employees table, and a TButton to
connect and fill the listbox with the actual data. The implementation of the OnClick event
handler can be as follows:
procedure TDataSnapClientForm.btnConnectClick(Sender: TObject);
var
Server: TDSRestConnection;
Proxy: TMyServerMethods; // name at server
DataSet: TDataSet;
begin
Server := TDSRestConnection.Create(nil);
try
Server.Protocol := 'https';
Server.Host := 'www.bobswart.nl';
Server.Port := 443;
Server.UrlPath := '/DataSnapXE2/DataSnapServerRESTISAPI.dll';
// Server.UserName := 'Bob';
// Server.Password := 'Swart';
Proxy := TMyServerMethods.Create(Server);
if Assigned(Proxy) then
try
ListBox1.Items.Add(DateTimeToStr(Proxy.ServerTime));
ListBox1.Items.Add('---');
DataSet := Proxy.GetEmployees;
// remainder of the listing (reconstructed; the original was cut off here):
// show the names and clean up
try
while not DataSet.Eof do
begin
ListBox1.Items.Add(DataSet.FieldByName('FIRST_NAME').AsString +
' ' + DataSet.FieldByName('LAST_NAME').AsString);
DataSet.Next
end
finally
DataSet.Free
end
finally
Proxy.Free
end
finally
Server.Free
end
end;
Note that we have to change the SQL at the server side, to return the EMP_NO,
FIRST_NAME, LAST_NAME, DEPT_NO and JOB_COUNTRY fields, but not the HIRE_DATE
field, since that particular field type (TSQLDateTimeStamp) is not compatible and/or does
not (yet) work with the DataSnap Mobile Connectors.
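The adjusted server-side query could then look like this (a sketch; SQLDataSet1 is an assumed name for the dataset component behind GetEmployees):

```pascal
// avoid HIRE_DATE: its timestamp type does not work with the Mobile Connectors
SQLDataSet1.CommandText :=
  'SELECT EMP_NO, FIRST_NAME, LAST_NAME, DEPT_NO, JOB_COUNTRY FROM EMPLOYEE';
```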
Also, we must make sure that the TDSServerClass has its LifeCycle property set to Session
and not Invocation, otherwise the Server Methods unit will be destroyed before the dataset
can be returned.
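The LifeCycle is normally set in the Object Inspector, but expressed in code (before the server starts) it amounts to a single assignment (a sketch; DSServerClass1 is the default component name):

```pascal
// in the ServerContainer, e.g. in DataModuleCreate
DSServerClass1.LifeCycle := 'Session'; // 'Invocation' destroys the instance too early
```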
Export to Xcode
The project will compile in Delphi XE2, but you cannot run it (this will result in a rather
odd-looking Exception):
Using the DPR2XCODE tool, we can create the necessary xcode project files and icons, and
then use Xcode to open the DataSnapClient project on a Mac. For this, we need to copy
the entire project directory, including the xcode subdirectory to a disk or drive where you
can access it using a Mac running OS X and Xcode. I'm using a shared drive between
Windows (with Delphi XE2) and the Mac running OS X and Xcode myself, so I can just
double-click on the .xcodeproj file to open the project. See also the information at
http://docwiki.embarcadero.com/RADStudio/en/FireMonkey_Development_Setup_for_iOS
Regardless of how you will make the project files available on the Mac, you can double-
click on the DataSnapClient.xcodeproj file to open the project in Xcode, with the following
result:
Note that the screenshot above shows Xcode 4.2 running on Mac OS X 10.6, but I'm still
able to produce apps for iOS 5.1 devices. You may run Xcode 4.3 yourself, which requires
Mac OS X 10.7 to produce the iOS 5.1 output.
See my Delphi XE2 native iOS Development courseware manual, or my weblog at
http://www.bobswart.nl/Weblog/Blog.aspx?RootId=5:5267 for more details.
If everything is working according to plan, then the result in the iPhone Simulator will look
as follows:
Another example of an iOS DataSnap Client using the Mobile Connectors will be given in
the next edition of this manual at the end of the last section, where we'll create a DIRT
Client for iOS.
Summary
In this section, I've demonstrated how we can create DataSnap Mobile Connectors proxy
files (and need to make two minor changes in the generated filenames), and use these to
create FireMonkey for iOS DataSnap Client applications.
Start Prism XE2, and do View | Server Explorer to view the Prism Server Explorer. We
should first make a connection here, to verify that we can actually work with the DataSnap
Server.
The Server Explorer is a treeview with a root node called Data Connections. Right-click on
Data Connections and select Add Connection. In the dialog that follows, select DataSnap
from the list of data sources (note: you need to click on Change if a datasource is already
preselected).
You may want to uncheck the “Always use this selection” checkbox, unless you always
want to build only DataSnap data connections, of course.
Click on Continue to get to the next page of the dialog. Here, we can specify the details to
connect to the DataSnap Server. Note that you may encounter a situation where the Add
Connection dialog only shows the default fields (which are useless for DataSnap), as
follows:
In that case, it's likely that your .NET configuration file has been corrupted, and must be
fixed. Go to the C:\Windows\Microsoft.NET\Framework\v2.0.50727\CONFIG and
C:\Windows\Microsoft.NET\Framework\v4.0.30319\config directories, and look for the
machine.config file.
I've seen situations where the machine.config file had a duplicate entry for
DbProviderFactories:
<system.data>
<DbProviderFactories>
<add name="Microsoft SQL Server Compact Data Provider" ... />
<add name="Datasnap AdoDbx Data Provider" ... />
<add name="Interbase AdoDbx Data Provider" ... />
</DbProviderFactories>
<DbProviderFactories>
<add name="BlackfishSQL Local Provider" ... />
<add name="BlackfishSQL Remote Provider" ... />
<add name="AdoDbx Data Provider" ... />
</DbProviderFactories>
</system.data>
But you can also encounter an empty, self-closing <DbProviderFactories/> tag after one
with contents. Since the second occurrence will override the first one, this will lead to
errors about a missing (or unknown) data provider.
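For reference, a corrected section would contain a single DbProviderFactories element with all providers merged (a sketch; the elided attributes stay exactly as found in your own machine.config):

```xml
<system.data>
  <DbProviderFactories>
    <add name="Microsoft SQL Server Compact Data Provider" ... />
    <add name="Datasnap AdoDbx Data Provider" ... />
    <add name="Interbase AdoDbx Data Provider" ... />
    <add name="BlackfishSQL Local Provider" ... />
    <add name="BlackfishSQL Remote Provider" ... />
    <add name="AdoDbx Data Provider" ... />
  </DbProviderFactories>
</system.data>
```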
Don't forget to check both the Framework and Framework64 directories, which means you
need to check (and possibly fix) no less than four machine.config files. Also make sure
to make a backup copy just in case, and be very careful with these changes. You may
need to reboot for these changes to take effect, by the way.
On a machine with a correct(ed) machine.config file, the dialog will look different (and
correct), as follows:
In the Protocol drop-down combobox, we can select tcp/ip or http. For the DataSnap
server deployed on my web server, we should specify http (and make sure to log in as
user Bob, otherwise access will be denied). Note that we cannot specify https as the
connection protocol here, since the HTTPS protocol is only available after an adequate
instance of the TDBXCommunicationLayer has been registered (by the Visual Studio IDE),
which is not the case. So for now, we can only use HTTP or TCP/IP.
Next, we should specify the Host i.e. the machine name where the DataSnap Server is
running – this can be localhost if you are testing on the same local machine. For the
DataSnap Server on my web server, this should be www.bobswart.nl
Then you need to specify the Port number. By default this will be Port 80 for HTTP and
Port 211 for TCP/IP, but if you’ve read this courseware manual you will know that both
values could (or should) be different – at least make sure to specify the same value here
that you specified in the transport component(s) on the ServerContainer unit. Or, when
deployed using DataSnap WebBroker, the HTTP(S) port used for the server. Using IIS, we
can specify port 80 for HTTP or port 443 for HTTPS. Since there is no choice or support for
HTTPS here, we have to resort to port 80 instead.
The next property contains the Path. This is only important if you want to connect to a
Web Broker based DataSnap Server (where you need to specify the URLPath to get to the
DataSnap Web Server – the part after the http://..../ domain part, that is). For the current
example, the path should be assigned to /DataSnapXE2/DataSnapServerRESTISAPI.dll
Finally, don’t forget to specify the Authentication User name and Password, in case the
DataSnap Server is using Authentication. For the current example, we must specify User
name Bob and Password Swart in order to be allowed to connect to the DataSnap Server
on my web site.
Click on the Test Connection button to verify that a connection can be made to the
specified DataSnap Server. This will give you a “Test connection succeeded” dialog if
everything was specified correctly. You may get an error if the check for a HTTPS
connection is still present (in case you're testing your own DataSnap REST ISAPI Server
for example).
When you click on OK, a new entry for the DataSnap connection will be added to the Data
Connections tree. In this case, it’s for a www.bobswart.nl node.
If you expand the new node, you’ll find subnodes for Tables, Views and Stored Procedures.
The Tables and Views are empty, but the Stored Procedures node will contain all exposed
Server Methods from the DataSnap Server, including our custom server methods
EchoString, ReverseString, GetEmployees and ServerTime.
We can now test some of the server methods from the Server Explorer. For example,
right-click on the EchoString method and select View Parameters. This will give you a new
window where you can enter a value for the Value parameter. Let’s enter 42 as value for
the Value parameter. Now, right-click in the window and select “Execute”. This will execute
the EchoString method from the DataSnap Server, showing the result just below the
stored procedure parameters window:
While this is nice, it’s probably a bit more instructive to see how we can retrieve and use
the data from the Employees table, by using the GetEmployees method. This Stored
Procedure has no parameters, but we can still select the “View Parameters” command,
which just gives an empty list of stored procedure parameters. Again, right-click on this
window and select “Execute”. This time, the result is the complete set of records from the
Employees table, as returned by the GetEmployees method:
WinForms Client
Although working with DataSnap Server methods in the Server Explorer can be fun, it’s
more useful to call them from a .NET application. For the next example, do File | New
Project to start the New Project wizard in Prism (Oxygene for .NET). This will give you an
overview of the available targets.
From the Windows Project Type, select the Windows Application and change the Name
from WindowsApplication1 to DataSnapClient.
If you click on OK, a new project DataSnapClient will be created in the Prism IDE, with a
Main.pas unit for the Main Form.
From the Server Explorer, select the new connection to the DataSnap Server we created in
the previous section. The Properties Explorer will display the properties, including the
ConnectionString, which should be something like the following:
communicationprotocol=http;hostname=www.bobswart.nl;port=80;
urlpath=/DataSnapXE2/DataSnapServerRESTISAPI.dll;dsauthenticationuser=Bob;
dsauthenticationpassword=Swart
Right-click on the data connection node in the Server Explorer, and select “Generate Client
Proxy” option.
This will add a generated file called ClientProxy1.pas to the project, with the definition of a
class called TMyServerMethodsClient with a number of methods, including EchoString,
ReverseString, ServerTime, and GetEmployees.
A number of References will also be added to the project, as can be seen in the Solution
Explorer:
type
TMyServerMethodsClient = class
private
FDBXConnection: TAdoDbxConnection;
FInstanceOwner: Boolean;
FDSServerModuleCreateCommand: TAdoDbxCommand;
FDSServerModuleDestroyCommand: TAdoDbxCommand;
FEchoStringCommand: TAdoDbxCommand;
FReverseStringCommand: TAdoDbxCommand;
FServerTimeCommand: TAdoDbxCommand;
FAdd33Command: TAdoDbxCommand;
FGetEmployeesCommand: TAdoDbxCommand;
FAS_GetProviderNamesCommand: TAdoDbxCommand;
FAS_ExecuteCommand: TAdoDbxCommand;
protected
procedure Dispose(ADisposing: Boolean); virtual;
public
constructor (ADBXConnection: TAdoDbxConnection);
constructor (ADBXConnection: TAdoDbxConnection; AInstanceOwner: Boolean);
procedure Dispose; virtual;
function EchoString(Value: string): string;
function ReverseString(Value: string): string;
function ServerTime: DateTime;
function Add33(X1: Integer; X2: Integer; X3: Integer; X4: Integer; X5: Integer;
X6: Integer; X7: Integer; X8: Integer; X9: Integer; X10: Integer;
X11: Integer; X12: Integer; X13: Integer; X14: Integer; X15: Integer;
X16: Integer; X17: Integer; X18: Integer; X19: Integer; X20: Integer;
X21: Integer; X22: Integer; X23: Integer; X24: Integer; X25: Integer;
X26: Integer; X27: Integer; X28: Integer; X29: Integer; X30: Integer;
X31: Integer; X32: Integer; X33: Integer): Integer;
function GetEmployees: System.Data.IDataReader;
function AS_GetProviderNames: string;
procedure AS_Execute(ProviderName: String; CommandText: String;
var ParamReader: System.IO.Stream; var OwnerDataStream: System.IO.Stream);
end;
Apart from the proxy class, there are also a number of references added to the References
node of the project: Borland.Data.AdoDbxClient and Borland.Data.DbxClientDriver to be
precise.
As you can see from the code snippet of the TMyServerMethodsClient, this class has two
constructors: both with an ADBXConnection parameter, and the second one also with an
AInstanceOwner Boolean parameter. This means we need to call the constructor with an
argument. And in order to support that, we have to make a modification to the project
settings.
Right-click on the DataSnapClient node in the Solution Explorer, and select Properties. In
the window that is now shown, click on the Compatibility tab and then check the option
“Allow Create constructor calls”, which will allow us to call the .Create constructor, passing
arguments, instead of using the new operator.
Now, we can return to the Main Form, and place a Button on it. In the Click event of the
button, we can create a connection to the DataSnap Server and call one of its methods:
method MainForm.button1_Click(sender: System.Object; e: System.EventArgs);
var
Client: ClientProxy1.TMyServerMethodsClient;
Connection: Borland.Data.TAdoDbxDatasnapConnection;
begin
Connection := new Borland.Data.TAdoDbxDatasnapConnection();
Connection.ConnectionString :=
'communicationprotocol=http;hostname=www.bobswart.nl;port=80;' +
'urlpath=/DataSnapXE2/DataSnapServerRESTISAPI.dll;' +
'dsauthenticationuser=Bob;dsauthenticationpassword=Swart';
Connection.Open;
try
Client := ClientProxy1.TMyServerMethodsClient.Create(Connection);
MessageBox.Show(
Client.EchoString('Prism XE2'));
finally
Connection.Close;
end;
end;
The result should be the echo of Prism XE2, but is in fact an error message about the assembly
Borland.Data.DbxCommonDriver or one of its dependencies that could not be loaded
because it could not be found:
Alternatively, you may get an "Unknown driver: Datasnap" error, which is shown by the
latest edition of Prism XE2:
Strangely enough, this error only appears if we created a .NET Framework 4.0 client. If we
created a client for a different version of the .NET Framework, the DataSnap client works
just fine.
In order to change the version of the .NET Framework that the DataSnapClient project is
targeting, we can right-click on the project node again, and select Properties. In the
Application tab, we can set the Target Framework. If we change it from .NET Framework 4
to .NET Framework 3.5 for example, the project will work fine:
This time, when we click on the button, the result is the echo of Prism XE2.
Unfortunately, the same problem occurred with Delphi Prism XE and DataSnap Servers, so
it looks like the clients are limited to .NET Framework 3.5 or lower at this time.
DataGridView
In a similar way, we can call the GetEmployees method and assign the result to a
DataGridView. This poses a little problem, since GetEmployees will return an
IDataReader (the equivalent of a TSQLDataSet’s result), and not a DataSet or a
DataTable. We have to write a few lines of code to load the result of GetEmployees into a
new DataTable inside a DataSet (the equivalent of the TClientDataSet on the Win32 side).
method MainForm.button1_Click(sender: System.Object; e: System.EventArgs);
var
Client: ClientProxy1.TMyServerMethodsClient;
Connection: Borland.Data.TAdoDbxDatasnapConnection;
Employees: System.Data.IDataReader;
ds: System.Data.DataSet;
dt: System.Data.DataTable;
begin
Connection := new Borland.Data.TAdoDbxDatasnapConnection();
Connection.ConnectionString :=
'communicationprotocol=http;hostname=www.bobswart.nl;port=80;' +
'urlpath=/DataSnapXE2/DataSnapServerRESTISAPI.dll;' +
'dsauthenticationuser=Bob;dsauthenticationpassword=Swart';
Connection.Open;
try
Client := ClientProxy1.TMyServerMethodsClient.Create(Connection);
Employees := Client.GetEmployees;
ds := new DataSet();
dt := new DataTable("DataSnap");
ds.Tables.Add(dt);
ds.Load(Employees, LoadOption.PreserveChanges, ds.Tables[0]);
dataGridView1.DataSource := ds.Tables[0];
MessageBox.Show(
Client.EchoString('Prism XE2'));
finally
Connection.Close;
end;
end;
The result is the following data shown in the DataGridView of a Prism WinForms
application, which demonstrates that we can write thin .NET clients to connect to
DataSnap Servers.
ASP.NET Client
Let’s now build a second DataSnap Client, this time an ASP.NET web application. We can
add the new project to the same solution if you want. Do File | New Project again, and this
time select the Web category of Prism projects, which shows the ASP.NET Web Application
item. Remember the issue we had with the .NET Framework 4 version (I will present a
workaround at the end of this section), so select .NET Framework 3.5 for now. Call the
new project DataSnapWebClient, and make sure to add it to the current solution:
This will produce a DataSnapWebClient ASP.NET project with a Default.aspx page where
we can place ASP.NET controls.
But first we need to go to the Server Explorer, right-click on the node for the DataSnap
server, and select “Generate Client Proxy” again to add the ClientProxy1.pas unit (as well
as the required references) to the DatasnapWebClient project.
Then, we can place a button on the designer, double-click on it, and write the following
code, this time to call the ReverseString server method:
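A minimal sketch of such a click handler (assuming the page class is named _Default, a
Label1 control is on the page to show the result, and the same connection string as in the
WinForms client — these names are assumptions, not the book's original listing):

```oxygene
method _Default.Button1_Click(sender: System.Object; e: System.EventArgs);
var
  Client: ClientProxy1.TMyServerMethodsClient;
  Connection: Borland.Data.TAdoDbxDatasnapConnection;
begin
  Connection := new Borland.Data.TAdoDbxDatasnapConnection();
  Connection.ConnectionString :=
    'communicationprotocol=http;hostname=www.bobswart.nl;port=80;' +
    'urlpath=/DataSnapXE2/DataSnapServerRESTISAPI.dll;' +
    'dsauthenticationuser=Bob;dsauthenticationpassword=Swart';
  Connection.Open;
  try
    Client := ClientProxy1.TMyServerMethodsClient.Create(Connection);
    // show the reversed string in the Label on the page
    Label1.Text := Client.ReverseString('Prism XE2');
  finally
    Connection.Close;
  end;
end;
```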
The application itself is an issue monitoring and tracking system, called D.I.R.T., which
stands for Developer Issue Report Tool. The system is already being used by
developers (in different locations) to enter issue reports for projects of some of my
customers, and this section is a modified edition of the XE version of D.I.R.T.
Finally, new compared to the first (XE) version of DIRT is the Watch table, where we can
specify reports that are "watched" (read-only) by users. This table is based on a wish
that arose after a period of real-world usage of the initial DIRT system.
User
The User table contains the list of developers as well as testers and other users that need
to access the application. The required information is a username and (hashed) password, as
well as an e-mail address. Other fields, like address or phone number, can be added later
when necessary. In our example, a user can have a role, like “developer”, “tester” or
“manager”. The system will use these role names plus a special “admin” role. This
technique can be extended by using commas to separate multiple roles for a user when
needed.
CREATE TABLE [dbo].[User](
[UserID] [int] IDENTITY(1,1) NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[PasswordHASH] [nvarchar](50) NOT NULL,
[Role] [nvarchar](32) NOT NULL,
[Email] [nvarchar](50) NOT NULL
)
We should not store the real password of the user, but only the HASH value of the
password. This also means that only the hashed version of the password is sent over the
(secure) connection from the client to the server part of the application. We’ll see how to
generate the hash shortly, and how to ensure that the connection is secure.
Report
The Report table is the most detailed dataset. Here, we should be able to enter a new
report (and a brief summary of 140 characters – ready to be tweeted as well), describing
the Project, Version (optional), Module (optional), Type of issue as well as Priority. A
special field is used to contain the Status of the issue (like reported, assigned, opened,
solved, tested, deployed, closed). Also important are the date of the report, the user who
reported it, and the user who is assigned to the issue, although the latter is optional, since
it may not be known right from the start who should solve the issue.
CREATE TABLE [dbo].[Report](
[ReportID] [int] IDENTITY(1,1) NOT NULL,
[Project] [nvarchar](50) NOT NULL,
[Version] [nvarchar](20) NULL,
[Module] [nvarchar](50) NULL,
[IssueType] [int] NOT NULL,
[Priority] [int] NOT NULL,
[Status] [int] NOT NULL,
[ReportDate] [datetime] NOT NULL,
[LastUpdate] [datetime] NOT NULL,
[ReporterID] [int] NOT NULL,
[AssignedTo] [int] NULL,
[Summary] [nvarchar](140) NOT NULL,
[AttachedFileName] [nvarchar](100) NULL,
[Report] [nvarchar](4000) NOT NULL
)
Compared to the XE edition of DIRT, the LastUpdate and AttachedFileName fields are new.
LastUpdate is the date when the contents of the report were last updated (or the date of the
last comment, for example), while AttachedFileName is an optional field that points to a
file with a screenshot or an archive with more information.
Note that the ReporterID and AssignedTo fields are actually foreign keys to the User table
(as depicted in the diagram of the data module). Also note that the IssueType, Priority and
Status fields are integers. The actual meaning of these values, which can for example
range from 1 to 10, will have to be defined elsewhere. We could have used additional look-
up tables to store the meaning as strings, but we’ve decided to let the GUI worry about
the representation, and just use the integer values in our system.
We only have to agree on one rule: the IssueType and Priority range from 1 to 10, and 1
is lower than 10, so a Type of 10 and Priority of 10 is more urgent than a Type 1 issue
with Priority 1.
For the Status, we’ve defined an initial list of potential status values that an issue can
have, including the actors that can assign a certain status value:
Compared to the XE edition of DIRT, the status values 4, 5, and 9 were added. Although we
could add a look-up table with the translation of the status values to their string
representations, we’ve again decided to let the GUI handle that (as well as the translation
of the status strings).
Note that the Reporter is the initial user who reports an issue and assigns it to a
developer. Optionally, a developer could assign an issue as well. The developer will then
be able to assign the status In Progress, Testing (by the developer), as well as Solved,
after which the Reporter can check the solution. The next step is either an assignment to
Tested (so it can be deployed) or an assignment back to Assigned, if the issue was not
solved.
Comment
The Comment table is the one that will actually contain the workflow or logbook of what’s
being reported and done about the issue. This means that for each issue report, we can
have a user placing a comment (at a certain date, to see the actual progress).
New compared to the XE edition of DIRT is the AttachedFileName field, which optionally
contains the filename of a screenshot or other document with more details relevant to the
comment. This field was also added after we've used D.I.R.T. for some real projects for a
while, and felt the need to be able to add screenshots to issue comments.
Watch
Apart from the additional fields in the Report and Comment table, the real-world usage of
the XE edition of DIRT resulted in the actual addition of a whole new table: the Watch
table with a UserID and ReportID field to assign reports to people (or people to reports) as
"watchers", so these developers would be aware of a certain issue and the progress, but
didn't really need to be assigned to that particular issue.
These four tables, linked together by the different ReportID, ReporterID, AssignedTo and
UserID fields, are all we need to build the DataSnap Server application for the D.I.R.T.
project.
The source code archive of this Delphi XE2 DataSnap Development courseware manual
contains the SQL Server 2008 database scripts to create the database as well as the four
tables inside. It should not be too difficult to use a different database for the D.I.R.T.
server, provided that the database tables have a similar layout and structure.
DataSnap Server
Delphi XE2 Enterprise offers no fewer than three new wizards in the Object Repository to
help us create DataSnap Servers.
For the D.I.R.T. application, we need a DataSnap Server type that can be accessed from
all over the world in a secure and safe way (some of the issue reports may contain internal
details that we do not want to expose to everybody). To allow access from different mobile
devices and wireless connections, only the HTTPS (HTTP Secure) protocol is used, with an
SSL/TLS certificate to provide encrypted communication and secure identification of the
server. And while TCP/IP has proven to be faster than HTTP and HTTPS, speed is not a
concern for this application; if you need more raw speed, then TCP/IP may be the better
choice of communication protocol.
The choice for HTTPS means that several of the DataSnap Server project targets that the
wizards can produce can be used: an ISAPI DLL produced by the DataSnap WebBroker or
DataSnap REST Application wizards (hosted in IIS with support for HTTPS), or a stand-alone
DataSnap Server with HTTPS support based on its own certificate.
The use of IIS was preferred over the use of stand-alone certificates, so an ISAPI DLL
target was chosen as best option. The difference between the DataSnap WebBroker and
DataSnap REST application is the generation of additional files (JavaScript, templates,
style sheets and images) for the REST application. These additional files are not needed at
this time, so the choice was made to use the DataSnap WebBroker Application wizard,
producing an ISAPI dynamic link library. This ISAPI DLL will be deployed on Microsoft
Internet Information Services (IIS), as discussed later in this section.
In the second page of the DataSnap WebBroker Application wizard, we can specify whether we
want to use Authentication and Authorization, whether we want a Server Methods Class with
(optionally) some Sample Methods, whether we want to use Filters (and which ones), support
for Mobile Connectors, and an additional Server Module.
As Filters, we can add the Encryption (RSA + PC1) filters, or the ZLib compression filter.
Since we use HTTPS as supported by IIS, the encryption filters are not needed, and since
the data packets will be relatively small, we also do not need compression at this time (but
it can always be added later, when needed).
The option for Mobile Connectors adds support for generating proxies for mobile clients
like BlackBerry (Java), Android (Java), Windows Phone (C#) or iPhone (Objective-C).
Finally, the Server Module option can be used to generate an extra module for the
DataSnap Server components (instead of using the Web Module for these components),
which adds support for heavyweight callbacks.
In our case, we want both Authentication and Authorization, as well as a Server Methods
Class, but we don’t need any Sample Methods, since we can easily add our own custom
server methods. We also do not need Filters, Mobile Connectors or a separate Server
Module, so the server features can be selected as follows:
During the last step of the DataSnap WebBroker Application wizard, we can specify the
ancestor class for the Server Methods Class. Since we want to export both server methods
and datasets (using TDataSetProviders), a choice for the TDSServerModule is required
here. Using a TComponent ancestor class, we could define server methods to expose, and
using a TDataModule we could add non-visual components, but the TDSServerModule is
the ancestor class that also implements and exposes the IAppServerFastClass interface
methods using a TDSProviderDataModuleAdapter class behind the scenes.
If we now click on Finish, a new project is generated with two source files: a
ServerMethods unit and a unit with a web module inside. Save the project as DirtServer
(or any other name you want to give your D.I.R.T. project), the server methods unit in
DirtServerMethods.pas and the web module in DirtWebModule.pas.
Note that after renaming the Server Methods unit, the project won’t compile anymore. We
need to adjust the uses clause of the DirtWebModule.pas unit, changing
ServerMethodsUnit1 to DirtServerMethods.
We also need to adjust the assignment to PersistentClass in the OnGetClass event handler
in the same web module:
procedure TWebModule1.DSServerClass1GetClass(
DSServerClass: TDSServerClass;
var PersistentClass: TPersistentClass);
begin
PersistentClass := DirtServerMethods.TServerMethods1;
end;
With these two amendments we should be able to compile the DirtServer DataSnap Server
project, producing an ISAPI DLL. This means it’s time to start adding the real functionality,
starting with the authentication checks.
The next step involves the verification of the User and Password (which should actually
contain the hashed value of the password), making sure a corresponding record for this
combination exists in the User table.
For this, we can use a local TSQLConnection component with the sole purpose of performing a
direct SELECT command on the User table. Alternatively, we can store the user names and
hashed password values in a configuration file (a passwd.ini file, for example) on the server
machine, and read that file when the user tries to log in.
Set its DriverName property to MSSQL, and specify the HostName, Database, User_Name and
Password parameters in order to make a valid connection to the database. Make sure to set the
LoginPrompt property to False, and toggle the Connected property to verify that a successful
connection can be made.
object SQLConnection1: TSQLConnection
DriverName = 'MSSQL'
GetDriverFunc = 'getSQLDriverMSSQL'
LibraryName = 'dbxmss.dll'
LoginPrompt = False
Params.Strings = (
'HostName=.'
'Database=DIRT'
'User_Name=...'
'Password=...')
VendorLib = 'sqlncli10.dll'
end
Apart from the TSQLConnection to talk to the database, we also need a TSQLDataSet to
perform the query to look for the User and Password in the User table.
Local DB Connection
If you do not want to place the TSQLConnection and TSQLDataSet components on the web
module, you can also create them dynamically. Especially for situations where we only
need the TSQLConnection once (to verify the login), this feels like a “clean” solution, all
the more so because the SQLConnection parameter settings can then be read from a
.ini or config file (which is left as an exercise for the reader, by the way).
In both cases, the SQL query that we need to perform does a SELECT of the UserID and
Role from the User table, passing the User and (hashed) Password in the WHERE clause.
Note that we do not have to hash the password here, since it will already have been sent
in hashed format (something the client will need to do).
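On the client side, this can be done just before connecting. A minimal sketch, assuming the
client uses a TSQLConnection with the DataSnap driver and has its own copy of the HashMD5
helper function (the same Indy-based function shown later in this chapter):

```delphi
// Client side: send only the MD5 hash over the wire, never the plain password
SQLConnection1.Params.Values['DSAuthenticationUser'] := UserName;
SQLConnection1.Params.Values['DSAuthenticationPassword'] := HashMD5(Password);
SQLConnection1.Open;
```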
If the SELECT command returns a record, then we can store the UserID value in the
current session, using the GetThreadSession class method of the TDSSessionManager type
(found in the DataSnap.DSService unit) to get the session for the current thread, calling
PutData to store the value of the UserID in a field with the name “UserID”. The associated
Role retrieved by the SELECT command can then be added to the UserRoles collection.
procedure TWebModule1.DSAuthenticationManager1UserAuthenticate(
Sender: TObject; const Protocol, Context, User, Password: string;
var valid: Boolean; UserRoles: TStrings);
var
SQLConnection: TSQLConnection;
SQLQuery: TSQLQuery;
Role: String;
begin
valid := LowerCase(Protocol) <> 'http'; // require a secure (non-HTTP) protocol
if valid then
begin // validate User and Hashed password
SQLConnection := TSQLConnection.Create(nil);
try
SQLConnection.LoginPrompt := False;
SQLConnection.DriverName := 'MSSQL';
SQLConnection.Params.Clear;
// specify the HostName, Database, User_Name and Password params here,
// for example read from a configuration file on the server
SQLQuery := TSQLQuery.Create(nil);
SQLQuery.SQLConnection := SQLConnection;
SQLQuery.CommandText :=
'SELECT UserID, Role FROM [User] WHERE ' +
' (Name = :UserName) AND (PasswordHASH = :Password)';
SQLQuery.ParamByName('UserName').AsString := User;
SQLQuery.ParamByName('Password').AsString := Password;
try
SQLQuery.Open;
if not SQLQuery.Eof then
begin
valid := True;
TDSSessionManager.GetThreadSession.PutData('UserID',
SQLQuery.Fields[0].AsString);
Role := SQLQuery.Fields[1].AsString;
if Role <> '' then
UserRoles.Add(Role)
end
else
valid := False
finally
SQLQuery.Close;
SQLQuery.Free
end
finally
SQLConnection.Connected := False;
SQLConnection.Free
end
end
end;
Make sure to add the Data.DB and Data.SqlExpr units to the uses clause if you want to
create the TSQLConnection and TSQLQuery components dynamically. Placing the
TSQLConnection component on the web module will automatically add these units to the
uses clause of the interface section, but if you add these components in code, then the
required units will not be added to the uses clause automatically.
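For reference, the corresponding addition to the uses clause could look like this:

```delphi
uses
  Data.DB,      // TDataSet, TParam
  Data.SqlExpr; // TSQLConnection, TSQLQuery
```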
Note that the current implementation of the OnUserAuthenticate event only allows one
role per user. You can extend this either by adding more tables to the data model (defining
Roles as well as the UserRoles connections) or by separating multiple roles with commas in
the Role field. This latter option can be implemented without adding more tables to the
data model, and consists of parsing the Roles string, looking for commas and adding the
role names found before and after each comma, as follows:
if Roles <> '' then
begin
while Pos(',',Roles) > 0 do
begin
CodeSite.Send('Role: ' + Copy(Roles,1,Pos(',',Roles)-1));
UserRoles.Add(Copy(Roles,1,Pos(',',Roles)-1));
Delete(Roles,1,Pos(',',Roles))
end;
if Roles <> '' then
begin
CodeSite.Send('Role: ' + Roles);
UserRoles.Add(Roles)
end
end
This will allow a user to have both the Developer and Tester role, for example, or both the
Manager and Tester role.
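As an alternative to parsing the commas by hand, a TStringList (from the System.Classes
unit) can do the splitting. A sketch, assuming role names contain no embedded commas or
quote characters:

```delphi
var
  RoleList: TStringList;
begin
  RoleList := TStringList.Create;
  try
    RoleList.Delimiter := ',';
    RoleList.StrictDelimiter := True; // keep spaces inside role names intact
    RoleList.DelimitedText := Roles;  // e.g. 'Developer,Tester'
    UserRoles.AddStrings(RoleList);
  finally
    RoleList.Free;
  end;
end;
```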
Authorization
Once authenticated, we must handle the authorization as well. DataSnap offers the ability
to connect user roles with server classes and server methods. The OnUserAuthorize event
of the DSAuthenticationManager component can be used to implement the way we want to
deal with authorization. Basically, there are two ways to handle authorization: the
optimistic or the pessimistic approach. Optimistic means that any operation is allowed
unless explicitly forbidden, while pessimistic means that no operation is allowed unless
explicitly allowed. The best technique to use depends on the type of application that you
want to build. In this case, security may be important (you do not want unauthorized
people to be able to edit or even view the details of issue reports or comments), so we
decided to implement the pessimistic approach.
In short, this means that in the OnUserAuthorize event handler, we should change the
default implementation (which assigns True to valid), and start with False instead. We
should then examine the collection of Authorized Roles to see if our User Role can be
found in the collection of Authorized Roles.
However, sometimes there are operations for which no explicit Authorized Roles have been
specified, like the generation of the DataSnap proxy classes for the client side. This
functionality would be unavailable if we implement the OnUserAuthorize with the previous
code snippet. For that reason, we should use an alternative approach: if no Authorized
Roles have been explicitly specified, then we can assume the operation is allowed, except
when the user role is found in the list of explicit Denied Roles (in practice, when the
Authorized Roles are empty, the Denied Roles will most often also be empty, by the way).
This leads to the final operational implementation of the OnUserAuthorize event handler:
procedure TWebModule1.DSAuthenticationManager1UserAuthorize(Sender: TObject;
  EventObject: TDSAuthorizeEventObject; var valid: Boolean);
var
  UserRole: string;
begin
  valid := not Assigned(EventObject.AuthorizedRoles); // no explicit roles: allow
  if not valid then
    for UserRole in EventObject.UserRoles do
      if EventObject.AuthorizedRoles.IndexOf(UserRole) >= 0 then
        valid := True;
  if valid then
    if Assigned(EventObject.DeniedRoles) then
      for UserRole in EventObject.UserRoles do
        if EventObject.DeniedRoles.IndexOf(UserRole) >= 0 then
          valid := False;
end;
Note that the above implementation also makes an explicit check to see if a user role was
included in both the Authorized Roles and the Denied Roles (in which case the user ends
up being denied for the requested server method).
The actual assignment of authorizations by role will need to be done at the server methods
level. We’ll make sure that an admin has the right to add new users, a manager can view all
issues but only in a read-only way, a tester can report new issues, and both the tester and
developer can add comments to issues (working on ways to solve the issue).
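In Delphi XE2, this per-method assignment is typically done with the TRoleAuth attribute
from the Datasnap.DSAuth unit. A sketch of how the server methods class could be
annotated (the role strings follow this case study, but the annotated class below is an
illustration, not the complete declaration):

```delphi
uses Datasnap.DSAuth;

type
  TServerMethods1 = class(TDSServerModule)
  public
    [TRoleAuth('admin')]            // only admins may add new users
    procedure AddUser(const User, Password, Role, Email: String);
    [TRoleAuth('tester,developer')] // testers and developers may list users
    function GetUserNames: TDataSet;
  end;
```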
The GetCurrentUserID will be useful when we want to add a new issue report (or when we
want to comment on an existing issue), in which case our UserID is needed. The
GetCurrentUserRoles is even more useful, since that function will return the role (or roles)
that the current user belongs to, which we can use to open up functionality at the client
side.
Note that the UserRoles.Text will return zero, one or more roles in a single string, optionally
separated by CRLF pairs if we have more than one role.
The implementation of the GetCurrentUserID server method is only slightly more complex.
Using the same session, we can call the GetData method to return the value of the
‘UserID’ field (the one that we assigned when a user had a successful login). If the UserID
field does not exist or has an invalid value, we return -1 instead.
function TServerMethods1.GetCurrentUserID: Integer;
begin
Result := StrToIntDef(
TDSSessionManager.GetThreadSession.GetData('UserID'), -1)
end;
We won’t be using the UserID value for now, but it brings us to an interesting topic:
retrieving the list of users and/or adding new users.
With this SQL SELECT command specified, we can implement the GetUserNames server
method, which consists of opening the sqlUser table and moving to the First record (in
case it was already opened, but not positioned at the beginning).
function TServerMethods1.GetUserNames: TDataSet;
begin
try
sqlUser.Open;
sqlUser.First; // in case it already was opened
Result := sqlUser
except
on E: Exception do
begin
Result := nil;
sqlUser.Close;
SQLConnection1.Close
end
end;
end;
Note that we do not have to open the TSQLConnection component, since this will be
opened implicitly as soon as we open the TSQLDataSet. We do need to make sure that the
sqlUser is not closed before the function ends, since the “open” TSQLDataSet must be
passed as function result of this server method.
Getting a list of existing users is nice, but initially there will be no users. It’s time to
explore the server method to add a new user with a specific password and role to the
database.
The parameters for the AddUser procedure are the new user name, the password
(unencrypted, it will be hashed by the AddUser server method itself, although you are free
to change this and pass the already hashed version of the password along), the role and
the e-mail address.
The AddUser method also requires the TSQLConnection component, connected to the DIRT
database. We can implement the AddUser server method as follows:
procedure TServerMethods1.AddUser(const User, Password, Role, Email: String);
var
SQLQuery: TSQLQuery;
begin
SQLQuery := TSQLQuery.Create(nil);
try
  SQLQuery.SQLConnection := SQLConnection1;
  SQLQuery.CommandText := 'INSERT INTO [User] ' +
    ' (Name, PasswordHASH, Role, Email) ' +
    ' VALUES (:Name, :PasswordHASH, :Role, :Email)';
  SQLQuery.ParamByName('Name').AsString := User;
  SQLQuery.ParamByName('PasswordHASH').AsString := HashMD5(Password);
  SQLQuery.ParamByName('Role').AsString := Role;
  SQLQuery.ParamByName('Email').AsString := Email;
  SQLConnection1.Open;
  try
    SQLQuery.ExecSQL;
  finally
    SQLConnection1.Close;
  end;
finally
  SQLQuery.Free // do not leak the query component
end;
end;
The HashMD5 support function is declared in the private section of the TServerMethods1
class, as follows:
private
{ Private declarations }
function HashMD5(const Str: String): String;
HashMD5 can be implemented as follows, using the Indy IdHashMessageDigest unit with
the TIdHashMessageDigest5 type:
function TServerMethods1.HashMD5(const Str: String): String;
var
MD5: TIdHashMessageDigest5;
begin
MD5 := TIdHashMessageDigest5.Create;
try
Result := LowerCase(MD5.HashStringAsHex(Str, TEncoding.UTF8))
finally
MD5.Free
end
end;
Note that HashMD5 should remain private (or protected) to ensure that it’s not exported as
a server method, since you probably do not want to share the MD5 hashing with clients.
Finally, note that you need at least one user with the Admin role in the database before
other users can be added. This first user either needs to be added by hand, or before the
explicit custom attribute that restricts adding users to the Admin role is in place;
otherwise you’re faced with a chicken-and-egg problem and will not be able to log in to add
new users in the first place.
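For example, the first admin user can be seeded by hand with a simple INSERT. A sketch in
T-SQL, where the hash shown is the lowercase MD5 of the word 'secret' (the name, password
and e-mail address are of course placeholders — pick your own):

```sql
INSERT INTO [dbo].[User] (Name, PasswordHASH, Role, Email)
VALUES ('admin',
        '5ebe2294ecd0e0f08eab7690d2a6ee69', -- MD5('secret')
        'admin',
        '[email protected]')
```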
For the implementation of this server method, we not only need the TSQLConnection
component, but also a TSQLDataSet component called sqlReports.
A potential downside of using this simple SQL command is that we need to translate
some of the field values at the client side. For IssueType, Priority and Status that
is not a big problem, but for the ReporterID and AssignedTo values we would need a
list of usernames to substitute for these UserID values. This list is available by
calling GetUserNames, but right now only for a user with the role Tester or
Developer.
Since the SQL query should result in a read-only list anyway, we can extend it by joining
with the User table and replacing the ReporterID and AssignedTo fields with the actual
names of these users, as follows:
SELECT "ReportID", "Project", "Version", "Module", "IssueType",
"Priority", "Status", "ReportDate",
UReporterID.Name AS "Reporter", UAssignedTO.Name AS Assigned,
"Summary", "Report"
FROM "Report"
LEFT OUTER JOIN [User] UReporterID ON UReporterID.UserID = ReporterID
LEFT OUTER JOIN [User] UAssignedTO ON UAssignedTO.UserID = AssignedTO
WHERE Status >= :MinStatus AND Status <= :MaxStatus
ORDER BY Status
Note that I've used LEFT OUTER JOIN clauses here, because the AssignedTo field of
the Report table is not required (i.e. it can be NULL, in which case a plain INNER
JOIN would not return the report at all).
With this TSQLDataSet, we can implement the server method GetIssues as follows,
passing the MinStatus and MaxStatus integer argument values to the MinStatus and
MaxStatus query parameters:
function TServerMethods1.GetIssues(MinStatus,MaxStatus: Integer): TDataSet;
begin
try
SQLConnection1.Open;
sqlReports.Close;
sqlReports.ParamByName('MinStatus').Value := MinStatus;
sqlReports.ParamByName('MaxStatus').Value := MaxStatus;
sqlReports.Open;
Result := sqlReports
except
on E: Exception do
begin
Result := nil;
sqlReports.Close;
SQLConnection1.Close
end
end;
end;
Note that we must not close sqlReports here, but allow the TSQLDataSet to remain
open when the function returns, so the DataSnap framework can stream its contents to
the caller. At the client side, we'll be able to work with the resulting dataset, as
we'll see shortly.
Note that we pass all fields of the Report table, except for the ReportID (which is
an identity, or autoincrement, field), the ReporterID, and the Status fields. The
ReporterID can be determined by the server method based on the logged-in UserID, and
the Status field will automatically get the value 1 for "reported". The AssignedTo
field is optional, with a default parameter value of -1; this value means that the
reported issue is not yet assigned to a user. Based on the value of AssignedTo, the
SQL INSERT command will be slightly different (either with or without the AssignedTo
field and parameter).
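The fragment below has lost its function header and the code that builds the INSERT statement. Purely as an illustration, a reconstruction could look like the sketch that follows; the parameter order, the default parameter, and the SQL text are my assumptions inferred from the fragment, not the book's actual code:

```pascal
// Hypothetical reconstruction - the actual DirtServer code may differ.
function TServerMethods1.ReportNewIssue(const Project, Version, Module: String;
  IssueType, Priority: Integer; const Summary, Report: String;
  AssignedTo: Integer = -1): Boolean;
var
  InsertSQL: TSQLQuery;
  SQL: String;
begin
  // Build one of two INSERT variants: with or without the AssignedTo field.
  SQL := 'INSERT INTO Report (Project, Version, Module, IssueType, Priority, ' +
    'ReporterID, %s Summary, Report) VALUES (:Project, :Version, :Module, ' +
    ':IssueType, :Priority, :ReporterID, %s :Summary, :Report)';
  if AssignedTo >= 0 then
    SQL := Format(SQL, ['AssignedTo,', ':AssignedTo,'])
  else
    SQL := Format(SQL, ['', '']);
  // ...assign SQL to InsertSQL.CommandText, then continue with the
  // parameter assignments as shown in the printed fragment.
```

The printed fragment picks up right after this point, with the parameter assignments and the ExecSQL call.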
begin
Result := False;
InsertSQL := TSQLQuery.Create(nil);
try
InsertSQL.SQLConnection := SQLConnection1;
try
InsertSQL.ParamByName('Project').AsString := Project;
InsertSQL.ParamByName('Version').AsString := Version;
InsertSQL.ParamByName('Module').AsString := Module;
InsertSQL.ParamByName('IssueType').AsInteger := IssueType;
InsertSQL.ParamByName('Priority').AsInteger := Priority;
InsertSQL.ParamByName('ReporterID').AsInteger :=
StrToIntDef(TDSSessionManager.GetThreadSession.GetData('UserID'),0);
if AssignedTo >= 0 then
InsertSQL.ParamByName('AssignedTo').AsInteger := AssignedTo;
InsertSQL.ParamByName('Summary').AsString := Summary;
InsertSQL.ParamByName('Report').AsString := Report;
Result := InsertSQL.ExecSQL = 1
except
on E: Exception do
CodeSite.SendException(E)
end
finally
InsertSQL.Free;
SQLConnection1.Close
end;
end;
The function will return True in case the new record has been inserted correctly. Note that
the error handling only consists of sending the exception to CodeSite at this time. Feel free
to re-raise the exception or add your own error handling.
At this point, we've added no fewer than six server methods, to retrieve information
(GetCurrentUserID, GetCurrentUserRoles, GetUserNames, GetIssues), add new users
(AddUser), or add reports (ReportNewIssue). Two methods return datasets, but in a
read-only way: GetUserNames and GetIssues produce datasets that cannot be modified.
In order to allow users with the Developer role to retrieve reports and add comments, we
need to add functionality to the server module that goes beyond that which the server
methods can offer.
A developer working on an issue report should see not just the initial issue report, but also
all comments that have been posted for that particular issue. As a result, the
TDataSetProvider should export not only records from the Reports table, but also from the
Comments table, in a so-called master-detail relationship.
Note that this will return both the issues reported by the current user and the
issues assigned to the current user, both of which can be considered issues
"belonging" to the current user.
This SQL command returns 12 fields. We can double-click on the sqlReportUser
component to start the Fields Editor, right-click in the Fields Editor, and select
"Add All Fields" to create persistent fields for all 12 of them:
Some of these fields require special treatment. The ReportID field, for example, is
an autoincrement field. This means that we should modify the ProviderFlags property
for this field: the pfInUpdate subproperty must be set to False (since we cannot
assign an initial value to this primary key, and should never update it anyway), but
pfInKey should be set to True. The pfInWhere subproperty should already be set to
True.
Note that the SQL command has four parameters: MinStatus, MaxStatus, AssignedTo and
ReporterID. However, only two of them will be provided by the client, and two will be
automatically filled by the server itself (and as a consequence need to be removed from
the list).
The MinStatus and MaxStatus input parameters are of type ftInteger and should be
kept. The AssignedTo and ReporterID parameters should be removed from the Params
collection, so they will not show up at the client side (they can be filled in by
the server side itself, based on the UserID value in the session of the current
user).
Moving on to the detail: the sqlComments dataset should point its DataSource property to
dsReportUser, pointing to the master sqlReportUser. The actual SQL command for the
comments details is as follows, using a parameter called ReportID that connects the
comments to the reports:
This time, we must again modify the ProviderFlags for the primary key CommentID. So,
double-click on the sqlComments component to start the Fields Editor, right-click in the
Fields Editor and select “Add All Fields”.
Select the CommentID field, and in the Object Inspector, expand the ProviderFlags
property of the CommentID field. Set the pfInUpdate subproperty to False, and the
pfInKey subproperty to True (while pfInWhere should already be True as well). This should
ensure that the primary key CommentID is never sent to the server as assignment, but is
used in the WHERE clause to locate a record.
Since the sqlComments dataset is connected to the master sqlReportUser dataset, the
TDataSetProvider will export them both as so-called nested datasets, with the report being
the master and the comments (if any) as nested detail dataset. We’ll see that in the client
application in more detail.
Finally, note that there is also an upWhereKeyOnly option, but that’s dangerous, since it
only passes the primary key in the WHERE clause of UPDATE and DELETE commands,
ignoring any changes that other users may have done.
Since both datasets are read-only unidirectional datasets, we need to perform one
final task to make the master-detail relationship work. Each time we move to the
next record in the master resultset, we must "reset" the detail; this does not
happen automatically, because the detail is also a unidirectional dataset. In the
AfterScroll event of sqlReportUser we can close and re-open the sqlComments dataset,
passing the new value of the ReportID parameter.
This is implemented as follows:
procedure TServerMethods1.AS_sqlReportUserAfterScroll(DataSet: TDataSet);
begin
sqlComments.Close;
sqlComments.ParamByName('ReportID').AsInteger :=
sqlReportUser.FieldByName('ReportID').AsInteger;
sqlComments.Open;
end;
This will in effect ensure that the detail query is re-executed for every master record as
soon as we scroll to that master record, filling the entire nested dataset at the server side
before it’s exported by the TDataSetProvider to the client.
Note that the event handler was named AS_sqlReportUserAfterScroll and not the
default name sqlReportUserAfterScroll. This is done on purpose, for the following
reason: any public method of a server methods class will be exposed, and this
includes public event handlers. However, there is one exception to this rule:
methods that start with the AS_ prefix are hidden and not exposed from the server
methods class. We can use this knowledge to hide event handlers, so they won't
resurface in the server methods proxy unit at the client side (they are quite
useless to call from the client anyway).
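As a quick illustration of this naming rule, using the two method names from this chapter (the class layout is abbreviated):

```pascal
type
  TServerMethods1 = class(TDSServerModule)
  public
    // Exposed: appears in the generated client proxy unit
    function GetIssues(MinStatus, MaxStatus: Integer): TDataSet;
    // Hidden: the AS_ prefix keeps it out of the client proxy,
    // even though the method itself is public
    procedure AS_sqlReportUserAfterScroll(DataSet: TDataSet);
  end;
```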
While the client side will supply the values for the MinStatus and MaxStatus
parameters of the master query, we have to enter the values for the AssignedTo and
ReporterID parameters ourselves at the server side. This can be done by implementing
the BeforeGetRecords event handler of the TDataSetProvider, where we can fill the
parameters of the sqlReportUser TSQLDataSet before it's opened.
Both the ReporterID and AssignedTo parameter values can be assigned to the value of the
UserID for the current user that we stored in the current thread for the user right after the
login (remember?). This is implemented as follows:
procedure TServerMethods1.AS_dspReportUserWithCommentsBeforeGetRecords(
  Sender: TObject; var OwnerData: OleVariant);
var
  UserID: Integer;
begin
  UserID := StrToIntDef(TDSSessionManager.GetThreadSession.GetData('UserID'),0);
  sqlReportUser.ParamByName('ReporterID').AsInteger := UserID;
  sqlReportUser.ParamByName('AssignedTo').AsInteger := UserID;
end;
This will ensure that all parameters of sqlReportUser are filled before the query is
opened. As a consequence, any user who requests the Report and Comments records from
dspReportUserWithComments will only get the reports where the current user is either
the reporter (i.e. ReporterID) or the one assigned to it (i.e. AssignedTo), or both.
In other words, you can only see "your" reports, and not reports that belong to
someone else. For that, you need the Manager role and the GetIssues server method,
which returns an overview of all reported issues (but without the comments).
This component has a property called Roles, where we can define a collection of
roles with ApplyTo, AuthorizedRoles and DeniedRoles properties. The ApplyTo property
is the place where we can specify a method name, a class name, or a class name
followed by a dot followed by a method name. In our case, I want to specify that
only users with the role Developer should be allowed to call the seven pre-defined
IAppServerFastCall methods, so we need to add these seven methods to the ApplyTo
property.
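For reference, these are the seven AS_ methods, as defined by the IAppServer interface that the provider mechanism is built on (verify the exact names against the Datasnap units of your Delphi version):

```pascal
// The seven IAppServer methods to list in the ApplyTo property:
//   AS_ApplyUpdates
//   AS_DataRequest
//   AS_Execute
//   AS_GetParams
//   AS_GetProviderNames
//   AS_GetRecords
//   AS_RowRequest
```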
Note that we could prefix them with TServerMethods1. in order to indicate that we only
want to secure the 7 methods from that specific server methods class.
Once the ApplyTo property has been specified, we can enter Developer as role for the
AuthorizedRoles property, specifying that only a user with the Developer role is
allowed to use the exported TDataSetProvider.
You may feel that it’s a bit too strict to allow only the developer to edit existing issue
reports and add comments. Feel free to add the Tester and even the Manager roles to this
list as well (although it depends on the manager if you really want to give this user the
ability to modify reports and/or add comments).
Server Deployment
For the DataSnap D.I.R.T. Server to be safely and securely deployed, we either need to
deploy the ISAPI DLL in Microsoft’s Internet Information Services using the HTTPS
protocol, or we should deploy it as stand-alone DataSnap server with the RSA and PC1
filters to encrypt the transport channels. In the latter case, it’s recommended to use
TCP/IP as transport protocol and not HTTP, since TCP/IP is a lot faster. So the real-world
deployment options that we’ve used are:
- ISAPI DLL using HTTPS on IIS (see also section 4)
- Stand-alone using TCP/IP with PC1 and RSA filters
IIS has the additional benefit of allowing us to configure application pools that support
automatic load balancing, recycling, and limitation of the CPU and memory usage of the
server. For this reason, the D.I.R.T. DataSnap Server has been deployed as an ISAPI DLL
on IIS on a web server that has an SSL certificate installed for the domain
www.bobswart.nl that we use for this purpose. A virtual directory DataSnapXE2 has been
created, and a special application pool configured for the applications in this virtual
directory. Finally, the DirtServer.dll has been placed inside this virtual directory, resulting
in the following URL: https://www.bobswart.nl/DataSnapXE2/DirtServer.dll
Calling this URL will only present us with the “DataSnap XE Server” text, but at least that
is confirmation enough that the server is deployed. Of course, we should also ensure that
the DirtServer.dll can connect to the database. For that purpose, the required database
driver should also be deployed to the server machine. For SQL Server 2008, this means
the dbxmss.dll which should be placed in the search path (like the Windows\System32
directory, to ensure that the ISAPI DLL can load it).
Note that apart from the DirtServer.dll and the dbxmss.dll we do not have to deploy any
other files to the web server. The MIDAS.dll is not needed either, since we can add the
MidasLib unit to the uses clause of the project, which will link in the MIDAS.dll inside the
server project itself.
If you deploy the DataSnap standalone server, using TCP/IP and the RSA and PC1 filters,
then you must also deploy two Indy specific SSL DLLs: libeay32.dll and ssleay32.dll – or
make sure they already exist at the server machine. These DLLs are needed for the RSA
filter (which encrypts the password used by the PC1 filter). Without these two DLLs,
any client that wants to connect to the server will get a "Connection Closed
Gracefully" message, because the server was unable to load the two DLLs to start the
RSA filter that encrypts the PC1 keys.
By the way, the same two DLLs will be required for any DataSnap client, whether
connected to the TCP/IP server using the RSA and PC1 filters, or whether connected to the
ISAPI filter using HTTPS. But let’s first build the client, and worry about deployment later.
DataSnap Client
With the D.I.R.T. DataSnap Server deployed and available for testing, we can start
to write the DataSnap Client. This will be a smart client: a standalone executable
that we can put on a USB stick or place anywhere, and run from any Windows machine
(provided we have an internet connection to the https://www.bobswart.nl server to
access the DirtServer project).
Create a new VCL Forms application, and add a Data Module to centralize the data access
components to the DataSnap Server. As first step, we should place a TSQLConnection
component on the Data Module. Set the Driver property to Datasnap, and then expand the
Driver node to view the DataSnap specific properties in the Object Inspector. We should
first set the CommunicationProtocol to https, the Port to 443 and the HostName to
www.bobswart.nl (feel free to connect to your own server, of course). The URLPath
property should be set to /DataSnapXE2/DirtServer.dll which specifies the location of the
DirtServer on the web server. Don’t forget to set the LoginPrompt property to False, or
you will be presented with a login dialog when you try to connect to the server. And that’s
not useful, since we need to pass the hashed password (instead of the real password) from
the client to the server.
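The same configuration can also be done in code instead of in the Object Inspector. A sketch, using the driver parameter names mentioned above (this assumes the component is called SQLConnection1, as in this chapter):

```pascal
// Configure the DataSnap connection at runtime rather than at design time.
SQLConnection1.DriverName := 'Datasnap';
SQLConnection1.LoginPrompt := False; // we pass the hashed password ourselves
SQLConnection1.Params.Values['CommunicationProtocol'] := 'https';
SQLConnection1.Params.Values['HostName'] := 'www.bobswart.nl';
SQLConnection1.Params.Values['Port'] := '443';
SQLConnection1.Params.Values['URLPath'] := '/DataSnapXE2/DirtServer.dll';
```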
There is one catch here… without a valid value for the DSAuthUser and DSAuthPassword
properties, we are not allowed to connect to the server, and as a consequence the
generation of the DataSnap Client Classes will fail. For the DirtServer on my machine, that
will be a problem (but you can use the pre-generated client classes unit
DBXClientClasses.pas from the project archive if you want). For your own DataSnap
server, you could temporarily disable the OnUserAuthenticate event handler, or
simply assign True to its valid parameter for as long as you need to generate the
client classes, and then enable the authentication checks again.
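The temporary "accept everything" version of the event handler could look like this; the handler signature is the one the DataSnap server wizard generates for TDSAuthenticationManager in XE2, and the container/component names are assumptions:

```pascal
procedure TServerContainer1.DSAuthenticationManager1UserAuthenticate(
  Sender: TObject; const Protocol, Context, User, Password: string;
  var valid: Boolean; UserRoles: TStrings);
begin
  // TEMPORARY: accept any user, so the client proxy classes can be
  // generated. Restore the real authentication checks afterwards!
  valid := True;
end;
```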
Once the client classes have been generated, you'll see the TServerMethods1Client
class, which consists of the following public server methods (among others):
type
TServerMethods1Client = class(TDSAdminClient)
private
....
public
constructor Create(ADBXConnection: TDBXConnection); overload;
constructor Create(ADBXConnection: TDBXConnection;
AInstanceOwner: Boolean); overload;
destructor Destroy; override;
But before we can create an instance of the TServerMethods1Client and call the server
methods, we should first ensure we can make a connection.
Login
We can add a new form to the project and design it to be our custom login form, for
example as follows:
Make sure to set the PasswordChar property of the TEdit (called edPassword) that is
used to enter the password, for example to '*', so the typed password is not
readable on screen.
This dialog can be displayed in a modal way, and the resulting username and password
can be assigned to the TSQLConnection’s Params, adding or replacing the existing values.
We should hash the password the same way we did in the server application, leading to
the following code for a Login event:
procedure TFormClient.Login1Click(Sender: TObject);
var
MD5: TIdHashMessageDigest5;
Server: TServerMethods1Client;
UserRoles: String;
begin
DataModule1.SQLConnection1.Connected := False;
UserID := -1;
with TFormLogin.Create(Self) do
try
  if ShowModal = mrOK then
  begin
    DataModule1.SQLConnection1.Params.Values['DSAuthenticationUser'] := Username;
    MD5 := TIdHashMessageDigest5.Create;
    try
      DataModule1.SQLConnection1.Params.Values['DSAuthenticationPassword'] :=
        LowerCase(MD5.HashStringAsHex(Password, TEncoding.UTF8));
Note that I haven’t shown my actual code to disable and/or hide portions of the GUI based
on the user roles, but this is left as exercise for the reader. You can download the example
DirtServer and DirtCleaner client projects and examine the source code in more detail of
course.
The Login form will close the existing connection to the DataSnap Server and assign new
values to the DSAuthenticationUser and DSAuthenticationPassword parameters of the
TSQLConnection component. Whether or not the login was successful can be seen if the
connection is re-established, followed by the call to the GetCurrentUserID and
GetCurrentUserRoles server methods.
For now, return to the data module, and place four more components on the data module:
a TSqlServerMethod component called SqlServerMethodGetIssues, a TDataSetProvider
component called dspIssues, a TClientDataSet component called cdsIssues, and finally a
TDataSource component named dsIssues.
You can define a default value for the MinStatus and MaxStatus parameters, or assign
their values in code (which is what the next code snippet will show).
The dspIssues TDataSetProvider needs its DataSet property connected to
SqlServerMethodGetIssues.
Next, we need to point the ProviderName property of the cdsIssues TClientDataSet to
dspIssues. Since the resulting dataset cannot be modified, it may be a good idea to
set the ReadOnly property of cdsIssues to True, so any data-aware controls connected
to this dataset will not allow the user to make changes (which is frustrating if you
find out later that the changes cannot be saved or sent back to the server anyway).
Finally, the dsIssues TDataSource needs its DataSet property assigned to the cdsIssues
TClientDataSet, so we can connect data-aware controls to the dsIssues.
Now, on the client main form, we can place a TDBGrid named DBGridReports, to be
filled with the reports from this or another server method. In order to display the
result of the GetIssues server method, passing explicit values for the MinStatus and
MaxStatus filters, we can write the following code:
procedure TFormClient.ViewAllIssues1Click(Sender: TObject);
// Manager - all reports (read-only)
begin
DBGridReports.DataSource := nil;
DataModule1.SQLConnection1.Connected := True;
try
DataModule1.cdsIssues.Close;
DataModule1.SqlServerMethodGetIssues.Params.
ParamByName('MinStatus').AsInteger := MinStatus;
DataModule1.SqlServerMethodGetIssues.Params.
ParamByName('MaxStatus').AsInteger := MaxStatus;
DataModule1.cdsIssues.Open;
DBGridReports.DataSource := DataModule1.dsIssues;
finally
DataModule1.SQLConnection1.Connected := False
end
end;
Note that we open the connection to the DataSnap Server using the TSQLConnection
component on the data module, fill the SqlServerMethodGetIssues parameters, call the
server method, assign dsIssues to the DataSource property of DBGridReports so we can
see the result, and finally close the connection to the DataSnap Server again.
Although you can keep the connection open, disconnecting right after your request
makes the client a bit more robust (you always start with a fresh connection),
although also a bit slower (you always need to open that connection), and the server
a bit more scalable (only active connections need to be maintained).
The resulting dataset will contain fields like IssueType, Priority and Status, presenting
themselves as integer values. We can map these integer values into more human-friendly
string values. For this, we need to double-click on the cdsIssues TClientDataSet on the
data module, to start the Fields Editor. Right-click inside the Fields Editor and Add all
Fields.
For the IssueType, Priority and Status fields, we can implement the OnGetText event
handler to change the plain integer values into more human-readable string values.
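A sketch of such a handler for the Status field; the event-handler name assumes a persistent field called cdsIssuesStatus, and the numeric mapping is derived from the status case statement later in this chapter (check page 141 for the actual enumeration):

```pascal
procedure TDataModule1.cdsIssuesStatusGetText(Sender: TField;
  var Text: string; DisplayText: Boolean);
begin
  // Assumed mapping: Reported=1, Assigned=2, Opened=3,
  // Solved=6, Tested=7, Deployed=8, Closed=10.
  case Sender.AsInteger of
    1: Text := 'Reported';
    2: Text := 'Assigned';
    3: Text := 'Opened';
    6: Text := 'Solved';
    7: Text := 'Tested';
    8: Text := 'Deployed';
    10: Text := 'Closed';
  else
    Text := Sender.AsString; // fall back to the raw value
  end;
  DisplayText := True;
end;
```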
Feel free to add your own translations to these fields, but for our purpose it was enough.
Note that the original integer value is still presented in the IssueType and
Priority field values as well, as a little reminder of what the actual value was.
Using the Fields Editor, we can verify that cdsUserNames has two fields: Name and
UserID. This dataset will be used as a "support" dataset to provide a list of names,
or to translate a UserID value (for example from the AssignedTo or ReporterID
fields) into a readable name.
DSProviderConnection
Let’s now move to the reported issues for a user with the Developer or Tester role. In that
case, we cannot call the GetIssues server method, but we should use the exported
TDataSetProvider called dspReportUserWithComments from the server.
For this, we need to add a TDSProviderConnection component on the data module, as well
as two TClientDataSets – called cdsReports and cdsComments, and one TDataSource
component called dsReports connected to the cdsReports.
You can assign a default value to these parameters, but we’ll assign an explicit value to
them in code shortly.
We also need to double-click on cdsReports to start the Fields Editor, and right-click inside
the Fields Editor to Add all Fields. This will result in a list of 13 fields with one special field
at the end: sqlComments.
This particular field contains the nested detail records, in this case “comment” records,
that belong to the master Report record. We can use this field of type TDataSetField to
feed the second TClientDataSet called cdsComments.
Select the cdsComments TClientDataSet, and using the Object Inspector, assign
cdsReportssqlComments to the DataSetField property of cdsComments. The cdsComments
dataset will now automatically contain the detail records for the current master record in
the cdsReports dataset.
We should also double-click on the cdsComments TClientDataSet to start the Fields Editor
and right-click in it to Add all Fields.
Both TClientDataSets need a little modification with regards to their first field; the primary
key. Since I’ve specified the ReportID (in the Reports table) and the CommentID (in the
Comments table) to be autoincrement fields, we should not write these values to the
database. The DBMS itself will generate these values for us. As a consequence, for both
the cdsReportsReportID field and the cdsCommentsCommentID field, we must edit the
ProviderFlags and set the pfInUpdate subflag to False but the pfInKey subflag to True. This
will ensure that the TClientDataSets know that the primary key is in fact the key, but that
it should not be updated or written to the server.
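If you prefer doing this in code rather than in the Object Inspector, the same ProviderFlags can be set in the data module's OnCreate event, for example (pfInUpdate is simply left out of the set, which makes it False):

```pascal
procedure TDataModule1.DataModuleCreate(Sender: TObject);
begin
  // Primary keys are generated by the DBMS: never written back,
  // but used as the key in the WHERE clause.
  cdsReportsReportID.ProviderFlags := [pfInWhere, pfInKey];
  cdsCommentsCommentID.ProviderFlags := [pfInWhere, pfInKey];
end;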
We can do the same thing for an AssignedName lookup field for the AssignedTo field in the
cdsReports. And don’t forget the UserID field in the cdsComments, which could also
benefit from a lookup field called UserName.
Autoincrement Fields
A few more steps are required to make the autoincrement fields work, especially in a
master-detail relationship. Although we cannot write a value for the ReportID and
CommentID primary keys back to the server, we still need to assign a value at the client
side to allow us to create new issues or new comments. For these new records, we should
use negative key values, starting to count down from -1. This can be implemented using
the AfterInsert event of the two TClientDataSets as follows:
procedure TDataModule1.cdsReportsAfterInsert(DataSet: TDataSet);
begin
cdsReportsReportID.AsInteger := -1;
end;
Note that the code here simply assigns -1 and does not attempt to count further
down. Counting further down is not needed, since the client application sends all
updates back to the server right after a post or delete. This means that we will
never have more than one (currently being edited) record with a negative key value.
Right after the post (after an insert or edit), the new record is sent to the
server, and all we need to do is refresh the contents of cdsReports and cdsComments
in order to retrieve the actual autoincrement values from the server. This can be
done with a call to cdsReports.Refresh.
For that purpose, we need to implement the AfterDelete and AfterPost event handlers of
both TClientDataSets. The good news is that all four event handlers can be shared in one,
since in all cases there’s just one thing that we need to do:
procedure TDataModule1.cdsReportsOrCommentsAfterPostOrDelete(DataSet: TDataSet);
begin
cdsReports.ApplyUpdates(-1);
cdsReports.Refresh
end;
This will ensure that we call the ApplyUpdates to send any changes to the server, and right
after that call the Refresh method to refresh the data from the server.
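All four events are typically hooked up in the Object Inspector, but the wiring can also be expressed in code, which makes the sharing explicit:

```pascal
// One handler serves all four events: any post or delete triggers
// an ApplyUpdates followed by a Refresh.
cdsReports.AfterPost    := cdsReportsOrCommentsAfterPostOrDelete;
cdsReports.AfterDelete  := cdsReportsOrCommentsAfterPostOrDelete;
cdsComments.AfterPost   := cdsReportsOrCommentsAfterPostOrDelete;
cdsComments.AfterDelete := cdsReportsOrCommentsAfterPostOrDelete;
```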
Refresh will not just refresh the primary key values, but will actually refresh all data in the
records that we have selected. This may sound like a bad idea, and it may certainly be
unwise if you select a large number of records, but in this case the exported
DataSetProvider will only return the Reports (with Comments) that belong to the current
user, within the specified MinStatus and MaxStatus ranges, so if that produces thousands
and thousands of records, then perhaps this user is either a fanatic when it comes to
testing, or a developer who has too much work on his or her plate…
Viewing the issues, for a user with the role Developer or Tester, can now be implemented
as follows:
procedure TFormClient.ViewMyIssues1Click(Sender: TObject);
// Developer/Tester personal reports
begin
DBGridReports.DataSource := nil;
DataModule1.SQLConnection1.Connected := True;
try
DataModule1.cdsReports.Close;
DataModule1.cdsReports.Params.ParamByName('MinStatus').AsInteger :=
MinStatus;
DataModule1.cdsReports.Params.ParamByName('MaxStatus').AsInteger :=
MaxStatus;
DataModule1.cdsReports.Open;
DBGridReports.DataSource := DataModule1.dsReports;
// nested dataset
DBGridReports.Columns[DBGridReports.Columns.Count-1].Visible := False;
finally
DataModule1.SQLConnection1.Connected := False
end
end;
The last column in the TDBGrid is the nested dataset. We can either set the Visible
property of that field to false in the Fields Editor of the cdsReports, or set Visible to false
for the last column in the TDBGrid. The code above implements the latter choice.
Note that apart from that nested field column, the code to display the reported issues is
very similar to what was needed for the Manager’s call to the GetIssues server method,
exposed by the SqlServerMethodGetIssues. However, where the Manager’s result is a
read-only dataset (actually set to read-only using the property ReadOnly), the Developer
and Tester’s result is a dataset where we can make modifications.
For these modifications, however, we do not want users to edit data in a grid, so a new
form was designed to allow editing and updating of individual issue reports.
Since the Version and Module fields are sometimes left empty, and this form presents the
data in a read-only way, I decided to add some code to hide the Version and Module fields
in case they are empty. This code can be added to the FormCreate, and is as follows:
procedure TFormIssue.FormCreate(Sender: TObject);
begin
DBEditVersion.Visible := DBEditVersion.Text <> '';
lbDBEditVersion.Visible := DBEditVersion.Text <> '';
DBEditModule.Visible := DBEditModule.Text <> '';
lbDBEditModule.Visible := DBEditModule.Text <> '';
end;
The comments are displayed in a TDBGrid, with the current comment details shown in the
second TDBMemo. Adding a comment should not be done in the grid (although this is
possible), but is nicer in a new form. For that, we’ve designed a special form to enter the
new comments, and display it as follows:
procedure TFormIssue.btnNewCommentClick(Sender: TObject);
begin
with TFormComments.Create(Self) do
try
ShowModal;
finally
Free
end
end;
The new form itself has a very simple layout again, using two buttons and a TMemo with
the vertical scrollbar option enabled.
Note that only the actual comments are needed to insert a new comment record, since the
client application knows the (current) report we want to comment on, it knows the current
user, as well as the date, so the only field that needs “manual” input is the actual
comment itself.
As a result, the implementation of the OK button is as follows:
procedure TFormComments.BitBtn1Click(Sender: TObject);
begin
DataModule1.cdsComments.Insert;
DataModule1.cdsCommentsCommentID.AsInteger := -1;
DataModule1.cdsCommentsUserID.AsInteger := UserID;
DataModule1.cdsCommentsCommentDate.AsDateTime := Now;
DataModule1.cdsCommentsComment.AsString := Memo1.Text;
DataModule1.cdsComments.Post;
Close
end;
This will also close the dialog, returning the user to the overview with refreshed
contents of the reports and comments.
Obviously, this is not enough if we also want to modify the state of the reported issue, for
example by changing the state from “Assigned” to “Opened” or “Tested”. For this, we need
a drop-down combobox filled with the possible values for the Status field (Reported,
Assigned, Opened, Solved, Tested, Deployed and Closed):
The implementation of the Button Click also needs to take the new value into
account, posting the new value into the cdsReports table. However, since the
TComboBox only contains the literal string items, we have to "translate" these to
the actual numerical values in the database (see also page 141 for the enumeration
of the Status values).
procedure TFormComments.BitBtn1Click(Sender: TObject);
begin
if Length(Memo1.Text) > 1 then
begin
DataModule1.cdsComments.Insert;
DataModule1.cdsCommentsCommentID.AsInteger := -1;
DataModule1.cdsCommentsUserID.AsInteger := UserID;
DataModule1.cdsCommentsCommentDate.AsDateTime := Now;
DataModule1.cdsCommentsComment.AsString := Memo1.Text;
DataModule1.cdsComments.Post
end;
DataModule1.cdsReports.Edit;
case cbStatus.ItemIndex of
0..2: DataModule1.cdsReportsStatus.AsInteger := cbStatus.ItemIndex+1;
3..5: DataModule1.cdsReportsStatus.AsInteger := cbStatus.ItemIndex+3;
else DataModule1.cdsReportsStatus.AsInteger := 10
end;
DataModule1.cdsReports.Post;
Close
end;
I leave it as an exercise for the reader to check whether the value of the Status
field has indeed changed before editing and posting the cdsReports TClientDataSet
(editing and posting may not always be needed).
The cdsComments.Post would normally also post the changes to the server; however, since
it is followed by a cdsReports.Post, we can make sure the updates are only applied in
the post of the cdsReports. This means that we can “unhook” the OnAfterPost and
OnAfterDelete event handlers of the cdsComments, making sure we call ApplyUpdates only
once (after the post of the cdsReports).
Obviously, that strategy cannot be combined with the above reader exercise to edit and
post the cdsReports only if the Status field has actually changed value. Again, I leave
it as an exercise for the reader to figure out what the optimal edit-post sequence looks
like (with only a single call to ApplyUpdates at the end).
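For readers who want a head start with that exercise, here is one possible sequence (a sketch, not necessarily the book's intended solution): compute the new Status value first, touch the cdsReports only when it differs, and apply the updates once at the very end.

```delphi
procedure TFormComments.BitBtn1Click(Sender: TObject);
var
  NewStatus: Integer;
begin
  if Length(Memo1.Text) > 1 then
  begin
    DataModule1.cdsComments.Insert;
    // ... fill the comment fields as before ...
    DataModule1.cdsComments.Post
  end;
  // same index-to-value mapping as in the original Button Click code
  case cbStatus.ItemIndex of
    0..2: NewStatus := cbStatus.ItemIndex + 1;
    3..5: NewStatus := cbStatus.ItemIndex + 3;
  else NewStatus := 10
  end;
  if DataModule1.cdsReportsStatus.AsInteger <> NewStatus then
  begin
    DataModule1.cdsReports.Edit;
    DataModule1.cdsReportsStatus.AsInteger := NewStatus;
    DataModule1.cdsReports.Post
  end;
  DataModule1.cdsReports.ApplyUpdates(-1); // the single ApplyUpdates call
  Close
end;
```

This assumes the OnAfterPost and OnAfterDelete handlers have been unhooked as described above, so no other ApplyUpdates call fires in between.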
Let’s take a look back at the Issue Form. The field for the assigned user is not required.
This means that the field may be NULL, so it should be possible for the field to be assigned
after the report has been created. For that purpose, I’ve added a TDBLookupCombobox to
the Issue Form (on top of the TDBEdit, and invisible by default), pointing to the
AssignedTo field of the dsReports datasource. The combobox is filled with the available
users in the database, by using the dsUserNames on the data module as ListSource, with
the Name field as ListField, and the UserID field as KeyField.
A TBitBtn with the caption “Assign” will do the rest, by making the TDBLookupCombobox
visible, allowing the user to select a User, and posting the changes with a second click
on the same button (whose caption has changed to “Post” by then):
if btnAssign.Caption = 'Assign' then
begin
DataModule1.cdsReports.Edit;
DBLookupComboBox1.Visible := True;
btnAssign.Caption := 'Post';
end
else
begin
DataModule1.cdsReports.Post;
DBLookupComboBox1.Visible := False;
btnAssign.Caption := 'Assign';
end;
Note that we should also set the Status to “Assigned” in case the Report was only
“Reported”, which is done as follows:
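The listing for that assignment did not survive in this text; inside the “Post” branch of the Assign button handler (before the cdsReports.Post), it could look like this, using the Status values from the enumeration on page 141:

```delphi
// promote the issue from Reported to Assigned when a user is selected
if DataModule1.cdsReportsStatus.AsInteger = 1 then // 1 = Reported
  DataModule1.cdsReportsStatus.AsInteger := 2;     // 2 = Assigned
```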
And the Issue Report form now looks as follows at design-time, using the background color
$00E7E7E7 to indicate that the fields are read-only by default:
For a user in the role of Manager, the New Comment button (and label to explain the
functionality) will be made invisible, just as the Assign button, so the manager can really
only look at the issue report details, but not edit them.
A final feature enables the user to close this form by pressing Escape. For that, we need
to set the KeyPreview property of the Issue Form to True, and implement the OnKeyUp (or
OnKeyDown) event handler as follows:
procedure TFormIssue.FormKeyUp(Sender: TObject; var Key: Word;
Shift: TShiftState);
begin
if Key = VK_ESCAPE then
Close
end;
This will close the form as soon as the user hits Escape. Note that since this is not an input
form, it should not destroy any unsaved changes (which might happen if you also
implement this OnKeyUp event in the new comment form, for example).
However, we can also use the existing cdsReports TClientDataSet, connected to the
DSProviderConnection, to simply insert a new record in the cdsReports table, with -1
assigned to the primary key (automatically), and use data-aware controls to design the
GUI. This ignores the ReportNewIssue server method (which was only allowed for Testers
anyway), and uses the dataset provider functionality that was enabled for both Testers
and Developers.
The data-aware Report New Issue form is defined as follows, with a single TDataSource
connected to the cdsReports on the data module, and a set of data-aware controls.
During the FormCreate, we can do an Insert on the cdsReports dataset, assigning the
UserID to ReporterID, the Status as reported (value 1), as well as the ReportDate (as
Now). The ReporterID, Status and ReportDate fields are made read-only in this form.
The OK button will do a Post and close the form, while the Cancel button will do a Cancel
and close the form. Implemented as follows:
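The implementation itself is not listed here; a minimal sketch (the button names btnOK and btnCancel are my assumption) would be:

```delphi
procedure TFormNewIssue.btnOKClick(Sender: TObject);
begin
  DataModule1.cdsReports.Post; // triggers the AfterPost / ApplyUpdates logic
  Close
end;

procedure TFormNewIssue.btnCancelClick(Sender: TObject);
begin
  DataModule1.cdsReports.Cancel; // discard the inserted record
  Close
end;
```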
Note that the IssueType, Priority, and AssignedTo fields are assigned using simple TDBEdit
controls. It would be a good idea to use a numerical spinedit control for the IssueTypes
and Priorities (limiting the input to a range of 1..10), and a drop-down combobox showing
the known users for the AssignedTo field. Since the AssignedTo field is not required, we
cannot use a TDBLookupComboBox component (which would force us to always make a
choice, even if we do not want to assign the issue report to a user, yet).
So, instead I’ve used a regular TCombobox filled with the names of the users in the
FormCreate event, with the following additional code (see next page):
DataModule1.cdsUserNames.Open; // in case it was closed
DataModule1.cdsUserNames.First;
cbAssignedTo.Items.Clear;
cbAssignedTo.Items.Add('Nobody');
while not DataModule1.cdsUserNames.Eof do
begin
cbAssignedTo.Items.AddObject(
DataModule1.cdsUserNamesName.AsString,
TUserID.Create(DataModule1.cdsUserNamesUserID.AsInteger));
DataModule1.cdsUserNames.Next;
end;
The TUserID class is a special support class that I’ve created to hold the UserID so we can
add this object as second argument to the AddObject method of the ComboBox’s Items.
type
TUserID = class
UserID: Integer;
constructor Create(const NewUserID: Integer);
end;
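The constructor implementation is a one-liner that stores the passed value:

```delphi
constructor TUserID.Create(const NewUserID: Integer);
begin
  inherited Create;
  UserID := NewUserID
end;
```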
To avoid a memory leak, we should also cleanup the objects from the TComboBox again
when the form is closed, which can be done as follows in the FormClose:
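The FormClose code itself is not listed in this text; a sketch could look as follows (note that the first item, “Nobody”, has no associated object, but calling Free on a nil reference is harmless in Delphi):

```delphi
procedure TFormNewIssue.FormClose(Sender: TObject; var Action: TCloseAction);
var
  i: Integer;
begin
  for i := 0 to cbAssignedTo.Items.Count - 1 do
    cbAssignedTo.Items.Objects[i].Free; // free the TUserID helper objects
  cbAssignedTo.Items.Clear
end;
```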
Note that we’re also assigning the current user’s name to the Text property of a TEdit
called edReporter, so we see the name of the person who submitted the report (which
should be our own name).
This results in the following design of the Report New Issue form:
Note that the IssueType and Priority TSpinEdit controls have a default value of 1, and a
MinValue of 1 and MaxValue of 10. Feel free to change the default Value to for example 5
if you want, to make it easier to report more urgent issues.
As optional extension, we could also add TComboBox controls for the Project, Version and
Module input fields, retrieving a list of existing values for the Project, Version and Module
fields in the current dataset. This requires an extension of the DIRT Server as well, with
two new server methods for the Tester and Developer roles:
[TRoleAuth(RoleTester + ',' + RoleDeveloper)]
function GetProjectNames: String;
[TRoleAuth(RoleTester + ',' + RoleDeveloper)]
function GetModuleNames(const Project: String): String;
Note that GetModuleNames takes a parameter so that it returns only the modules that were
used for a specific project (to avoid a list of all possible module names, including
those that belong to a different project).
In order to avoid having to place another TSQLQuery component on the Server Methods
container, we can dynamically create one, retrieve all unique project names, and return
the string before destroying the local TSQLQuery component again. In code, this is
implemented as follows:
function TServerMethods1.GetProjectNames: String;
var
SelectSQL: TSQLQuery;
begin
CodeSite.EnterMethod('GetProjectNames');
Result := '';
SelectSQL := TSQLQuery.Create(nil);
try
SelectSQL.SQLConnection := SQLConnection1;
SelectSQL.CommandText := 'SELECT DISTINCT Project FROM Report ORDER BY Project';
try
SelectSQL.Open;
SelectSQL.First;
while not SelectSQL.Eof do
try
Result := Result + SelectSQL.Fields[0].AsString + #13#10;
finally
SelectSQL.Next
end
except
on E: Exception do
CodeSite.SendException(E)
end
finally
SelectSQL.Free;
SQLConnection1.Close
end;
CodeSite.ExitMethod('GetProjectNames')
end;
The result of this Server Method can be assigned to the Items.Text property of the
Projects TComboBox. In a similar way, we can implement the GetModuleNames, passing
the associated Project Name as parameter.
function TServerMethods1.GetModuleNames(const Project: String): String;
var
SelectSQL: TSQLQuery;
begin
CodeSite.EnterMethod('GetModuleNames');
Result := '';
SelectSQL := TSQLQuery.Create(nil);
try
SelectSQL.SQLConnection := SQLConnection1;
SelectSQL.CommandText := 'SELECT DISTINCT Module FROM Report ' +
' WHERE Project LIKE :Project ORDER BY Module';
SelectSQL.Params.Clear;
with SelectSQL.Params.AddParameter do
begin
Name := 'Project';
DataType := ftString;
ParamType := ptInput;
AsString := Project;
end;
try
SelectSQL.Open;
SelectSQL.First;
while not SelectSQL.Eof do
try
if (not SelectSQL.Fields[0].IsNull) and (SelectSQL.Fields[0].AsString <> '') then
Result := Result + SelectSQL.Fields[0].AsString + #13#10;
finally
SelectSQL.Next
end
except
on E: Exception do
CodeSite.SendException(E)
end
finally
SelectSQL.Free;
SQLConnection1.Close
end;
CodeSite.ExitMethod('GetModuleNames')
end;
Showing the New Issue Form now includes the assignment of the GetProjectNames result to the
Text property of the DBComboBox1.Items, as follows:
procedure TFormClient.ReportnewIssue1Click(Sender: TObject);
var
Server: TServerMethods1Client;
begin
with TFormNewIssue.Create(Self) do
try
DataMod.DataModule1.SQLConnection1.Connected := True;
Server := TServerMethods1Client.Create(DataMod.DataModule1.SQLConnection1.DBXConnection);
try
DBComboBox1.Items.Text := Server.GetProjectNames;
// DBComboBox2.Items.Text := Server.GetModuleNames;
finally
Server.Free;
DataMod.DataModule1.SQLConnection1.Connected := False;
end;
ShowModal
finally
Free
end
end;
Note that we do not assign the result of the GetModuleNames server method call to the
second DBComboBox, since this one should be called with the current selected project as
parameter. Instead of guessing one here, we can implement that using the OnChange
event in the TFormNewIssue itself:
procedure TFormNewIssue.DBComboBox1Change(Sender: TObject);
var
Server: TServerMethods1Client;
begin
if DBComboBox1.ItemIndex >= 0 then
try
DataMod.DataModule1.SQLConnection1.Connected := True;
Server := TServerMethods1Client.Create(DataMod.DataModule1.SQLConnection1.DBXConnection);
try
DBComboBox2.Items.Text := Server.GetModuleNames(DBComboBox1.Items[DBComboBox1.ItemIndex]);
finally
Server.Free;
DataMod.DataModule1.SQLConnection1.Connected := False;
end
except
on E: Exception do
ShowMessage(E.Message)
end
end;
Note that for a remote connection, this call may take a split second to return, so you
may notice a sluggish GUI here. To improve performance, you may want to create an
instance of the TServerMethods1Client when the TFormNewIssue is created, use it during
the lifetime of this form, and free it only when the form is closed again. That way, you
do not have to create and free the server proxy in every OnChange event.
The result can be seen below, where a choice for “DIRT” leads to two possible Modules for
example:
First of all, the sorting. The easiest way to perform sorting is to assign a value to the
IndexFieldName value of the underlying DataSet. This can be done in the OnTitleClick
event of the TDBGrid (note that you must also set the dgTitleClick option in order to
enable this event in the first place).
So, the first implementation of OnTitleClick is as follows, using the column title names
as fieldnames (which works for almost all columns):
procedure TFormClient.DBGridReportsTitleClick(Column: TColumn);
begin
if DBGridReports.DataSource = DataMod.DataModule1.dsReports then
begin
if Column.Title.Caption = 'Reporter' then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames := 'ReporterID'
else
if Column.Title.Caption = 'Assigned' then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames := 'AssignedTo'
else
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.Title.Caption
end
else // dsIssues
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.Title.Caption
end;
Note that for the read-only dsIssues, the column titles can be used as IndexFieldNames,
since all fields are straight data fields. However, for the dsReports, we returned the
ReporterID and AssignedTo fields, and used them in combination with the cdsUserNames
table to create two lookup fields for the Reporter and Assigned names.
Unfortunately, we cannot use lookup fields as IndexFieldNames, so in those cases (for the
Reporter and Assigned columns) we must fall back to the original ReporterID and
AssignedTo fields. This will not actually sort them by name (rather by ID), but at least the
entries will be grouped together, which is sometimes handy as well.
However, these sort orders are ascending only, from small to big. In order to allow
descending sorting (from the highest priority to the lowest, with the highest on top),
we need another approach. The TClientDataSet cannot use IndexFieldNames to sort in
descending order, but we can define a whole new index (at run-time or design time) for
this purpose.
In the Data Module, select the cdsReports and click on the ellipsis for the IndexDefs
property to start the editor for the index definitions. We can add 10 new definitions, call
them cdsReportsIndexProjectDesc, cdsReportsIndexVersionDesc, etc. (see screenshot).
For each index, we can specify the Fields (like Project for the cdsReportsIndexProjectDesc)
and check the ixDescending and ixCaseInsensitive option flags. This way, we can
define 10 indices that sort by a column in descending order.
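Instead of the design-time IndexDefs editor, such a descending index can also be created at run-time with the AddIndex method of the TClientDataSet, for example in the OnCreate of the data module (a sketch for one of the ten indices):

```delphi
// run-time equivalent of one design-time index definition
DataModule1.cdsReports.AddIndex('cdsReportsIndexProjectDesc',
  'Project', [ixDescending, ixCaseInsensitive]);
```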
Now, the code to sort a column can use the IndexFieldNames property when we want to sort
in ascending order, or the IndexName property if we want the descending order.
Setting one property will clear the other, so we don’t have to worry about that. Since the
index names are cdsReportsIndex + fieldname + Desc, we can now code the OnTitleClick
as follows:
procedure TFormClient.DBGridReportsTitleClick(Column: TColumn);
begin
if DBGridReports.DataSource = DataMod.DataModule1.dsReports then
begin
if Column.Title.Caption = 'Reporter' then
begin
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
'ReporterID' then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsReportsIndexReporterDesc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
'ReporterID'
end
else
if Column.Title.Caption = 'Assigned' then
begin
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
'AssignedTo' then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsReportsIndexAssignedDesc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames := 'AssignedTo'
end
else
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
Column.Title.Caption then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsReportsIndex' + Column.Title.Caption + 'Desc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.Title.Caption
end
else // dsIssues
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
Column.Title.Caption then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsIssuesIndex' + Column.Title.Caption + 'Desc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.Title.Caption
end;
The result is that we can click on any column to sort the grid on that field, and click again
to reverse the sort order. We could optionally set the “bold” option for the font of the
sorted column, but I leave that as exercise for the reader.
Note that using the Title.Caption may be a good idea as long as we do not rename the title
captions. However, if you want to include the ReportID field in the grid as well, but want to
change the Title.Caption to # for example, then this will no longer match the fieldname,
and as a result assigning the Title.Caption to the IndexFieldName will raise an exception.
In that case, and in general to be honest, it’s better not to use the Column.Title.Caption
but rather the Column.FieldName. This always contains the correct fieldname, while the
caption of the title can be modified at will. The relevant part of the OnTitleClick then
becomes:
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
Column.FieldName then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsReportsIndex' + Column.FieldName + 'Desc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.FieldName
end
else // dsIssues
if (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames =
Column.FieldName then
(DBGridReports.DataSource.DataSet as TClientDataSet).IndexName :=
'cdsIssuesIndex' + Column.FieldName + 'Desc'
else (DBGridReports.DataSource.DataSet as TClientDataSet).IndexFieldNames :=
Column.FieldName
end;
Coloring the TDBGrid involves implementing the OnDrawColumnCell event handler, which
can be done as follows:
procedure TFormClient.DBGridReportsDrawColumnCell(Sender: TObject;
const Rect: TRect; DataCol: Integer; Column: TColumn; State: TGridDrawState);
const
Color: Array[0..10] of Integer = ($E0E0E0, $FBFBE0,$EDF9E0,$E0F7E0,$E0F9ED,
$E0FBFB,$D0EDFD, $C0E0FF,$D0E0FD, $E0E0FB,$E0E0FF);
var
Str: String;
begin
if (Column.Field.FieldName = 'IssueType') or (Column.Field.FieldName = 'Priority') or
(Column.Field.FieldName = 'Status') then
begin
if (Column.Field.FieldName = 'Status') then
(Sender as TDBGrid).Canvas.Brush.Color := Color[10-Column.Field.AsInteger]
else
(Sender as TDBGrid).Canvas.Brush.Color := Color[Column.Field.AsInteger];
(Sender as TDBGrid).Canvas.Font.Color := clBlack;
(Sender as TDBGrid).Canvas.FillRect(Rect);
// draw the cell text in the colored rectangle
Str := Column.Field.AsString;
(Sender as TDBGrid).Canvas.TextOut(Rect.Left + 2, Rect.Top + 2, Str)
end
else // other columns use the default drawing
(Sender as TDBGrid).DefaultDrawColumnCell(Rect, DataCol, Column, State)
end;
A final extension of the grid involves placing a hint value for the current record.
Remember that the summary field is only partially visible, and the report field itself
not at all, so it’s not easy to determine the actual issue that was reported without
looking at the details. However, we can set the Hint property of the TDBGrid by
implementing the OnDataChange event handlers of the two TDataSources dsIssues and
dsReports as follows:
procedure TDataModule1.dsIssuesDataChange(Sender: TObject; Field: TField);
begin
try
ClientForm.FormClient.DBGridReports.Hint :=
cdsIssuesSummary.AsString
except
end
end;
Note that the data module is now referring back to the ClientForm global variable.
Email Progress
Another useful feature, allowing people to get notified when their reported issues are
being worked on (or when newly reported issues have been assigned to them), is the
sending of an e-mail to the e-mail address of the user associated with the ReporterID,
as well as the AssignedTo user (if present).
The sending of the e-mail can be done in two places: either at the client side, for example
in the OnAfterPostOrDelete of the cdsReports master dataset (which is the only place
where updates to the cdsReports (or cdsComments) are being sent back to the server), or
at the server side where it’s easy to send e-mails.
The ideal solution that I’ve implemented is actually a mix of the client and server benefits:
creating the e-mail message at the client side, and then forwarding it to the server by
calling a new server method SendEmail which will send the e-mail (from the server side).
The server-side implementation of the SendEmail is as follows, using another .ini file to
store the e-mail configuration details, and receiving the From, Recipient, Subject and Msg
parameters:
function TServerMethods1.SendEmail(const From, Recipient, Subject,
Msg: String): Boolean;
var
SMTP: TIdSMTP;
MailMessage: TIdMessage;
begin
Result := False;
SMTP := TIdSMTP.Create(nil);
try
with TIniFile.Create('C:\inetpub\Email.ini') do
try
SMTP.Host := ReadString('SMTP','Host', '');
SMTP.Port := ReadInteger('SMTP','Port',25);
SMTP.Username := ReadString('SMTP','Username', '');
SMTP.Password := ReadString('SMTP','Password', '');
finally
Free
end;
MailMessage := TIdMessage.Create(nil);
try
// setup message
MailMessage.From.Address := From;
MailMessage.BccList.EMailAddresses := From;
MailMessage.Recipients.EMailAddresses := Recipient;
MailMessage.Subject := Subject;
MailMessage.Body.Text := Msg;
try
CodeSite.Send('Sending e-mail');
try
SMTP.Connect;
SMTP.Send(MailMessage);
Result := True
except
on E: Exception do
CodeSite.SendException('SMTP', E)
end
finally
if SMTP.Connected then SMTP.Disconnect
end
finally
MailMessage.Free
end
finally
SMTP.Free
end
end;
Note that the Email.ini file is read for each e-mail that we want to send, so it’s easy
to change the settings in case of a connectivity problem, without having to restart the
server. Of course, if you never encounter e-mail problems, you can also cache the
settings in your application, so they only have to be read once. I leave that as an
exercise for the reader.
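The Email.ini file only needs an [SMTP] section with the four values read by SendEmail; the values below are placeholders, of course:

```ini
[SMTP]
Host=smtp.example.com
Port=25
Username=dirt-mailer
Password=secret
```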
At the client side, we now need to extend the event handler for the OnAfterPost and
OnAfterDelete of the cdsReports. First, we should determine the Recipient, based on the
ReporterID and AssignedTo fields from the current report.
Then, we can fill a TStringList with the values from the fields of the reported issue,
followed by all comments that belong to this issue (ordered by date – with the last
comment last).
procedure TDataModule1.cdsReportsOrCommentsAfterPostOrDelete(DataSet: TDataSet);
var
ReportID: Integer;
From,Recipient,Subject: String;
Msg: TStringList;
Server: TServerMethods1Client;
begin
Msg := TStringList.Create;
try
From := '[email protected]';
if cdsUserNames.Locate('UserID', cdsReportsReporterID.AsInteger, []) then
Recipient := cdsUserNamesEmail.AsString;
if (not cdsReportsAssignedTo.IsNull) and
(cdsReportsReporterID.AsInteger <> cdsReportsAssignedTo.AsInteger) then
if cdsUserNames.Locate('UserID', cdsReportsAssignedTo.AsInteger, []) then
Recipient := Recipient + ',' + cdsUserNamesEmail.AsString;
Subject := 'DIRT Client Issue #' + cdsReportsReportID.AsString;
Msg.Add(' Project: ' + cdsReportsProject.AsString);
Msg.Add(' Version: ' + cdsReportsVersion.AsString);
Msg.Add(' Module: ' + cdsReportsModule.AsString);
Msg.Add('Reporter: ' + cdsReportsReporterName.AsString);
Msg.Add('Assigned: ' + cdsReportsAssignedName.AsString);
Msg.Add('IssueTyp: ' + cdsReportsIssueType.AsString);
Msg.Add('Priority: ' + cdsReportsPriority.AsString);
case cdsReportsStatus.AsInteger of
1: Msg.Add(' Status: Reported ' + cdsReportsStatus.AsString);
2: Msg.Add(' Status: Assigned ' + cdsReportsStatus.AsString);
3: Msg.Add(' Status: Opened ' + cdsReportsStatus.AsString);
6: Msg.Add(' Status: Solved ' + cdsReportsStatus.AsString);
7: Msg.Add(' Status: Tested ' + cdsReportsStatus.AsString);
8: Msg.Add(' Status: Deployed ' + cdsReportsStatus.AsString);
10: Msg.Add(' Status: Closed ' + cdsReportsStatus.AsString);
end;
Msg.Add('Rep.Date: ' + cdsReportsReportDate.AsString);
Msg.Add(' Summary: ' + cdsReportsSummary.AsString);
Msg.Add(' Report:');
Msg.Add(cdsReportsReport.AsString);
Msg.Add('----------------------------------------------------------------');
cdsComments.First;
while not cdsComments.Eof do
begin
if cdsCommentsCommentID.AsInteger > 0 then
begin
Msg.Add('');
Msg.Add('Comment User: ' + cdsCommentsUserName.AsString);
Msg.Add('Comment Date: ' + cdsCommentsCommentDate.AsString);
Msg.Add('Comment:');
Msg.Add(cdsCommentsComment.AsString);
Msg.Add('----------------------------------------------------------------');
end;
cdsComments.Next;
end;
// now get the last comment, which may have key value of -1
cdsComments.First;
if (not cdsComments.Eof) and (cdsCommentsCommentID.AsInteger < 0) then
begin
Msg.Add('');
Msg.Add('Comment User: ' + cdsCommentsUserName.AsString);
Msg.Add('Comment Date: ' + cdsCommentsCommentDate.AsString);
Msg.Add('Comment:');
Msg.Add(cdsCommentsComment.AsString);
end;
if cdsReports.ApplyUpdates(-1) = 0 then
begin
Server := TServerMethods1Client.Create(DataMod.DataModule1.SQLConnection1.DBXConnection);
try
Server.SendEmail(From,Recipient,Subject,Msg.Text)
finally
Server.Free
end;
ReportID := cdsReportsReportID.AsInteger;
cdsReports.Refresh;
if ReportID > 0 then
cdsReports.Locate('ReportID', ReportID, [])
end
finally
Msg.Free
end
end;
We need two loops to retrieve the comments: first all comments with a known, already
submitted primary key value, and then the new comment with the negative primary key
value that is not yet submitted (because it is an autoincrement field at the server).
Note that the final part of the event handler (on this page) also involves repositioning
the current record in the cdsReports. In case of a new record, the call to Refresh will
return a new key value for the Report record (which was -1 when we started to apply the
updates); in that case we should go to the last record of the dataset.
These methods can be executed by anyone; no special permission is required. Any role
will do (of course, you still need to log in to the DataSnap server itself). Once logged
in, anyone can call ClientVersion to see what the latest version of the client is, call
ClientStream to get a TStream with the latest client executable in it, and call the
ClientStreamSize function to get the expected file size in bytes, in order to show a
little progress dialog on screen.
The management of the information about the client is done with help of a .ini file, with
the following contents:
[DirtCleaner]
Major=1
Minor=14
DirtCleaner.exe=C:\Inetpub\DirtCleaner.exe
We could also place the file size inside, and we could dynamically obtain the version
information from the executable, but I wanted to demonstrate both techniques.
The implementation of the ClientVersion simply returns the Major value multiplied by
1000 plus the Minor value, implemented as follows:
function TServerMethods1.ClientVersion: Integer;
begin
with TIniFile.Create('C:\inetpub\AutomaticUpdates.ini') do
try
Result := 1000 * ReadInteger('DirtCleaner', 'Major', 0) +
ReadInteger('DirtCleaner', 'Minor', 0)
finally
Free
end;
end;
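The ClientStreamSize function is mentioned in the text but not listed; following the same .ini conventions, a sketch could look as follows:

```delphi
function TServerMethods1.ClientStreamSize: Integer;
var
  Update: String;
begin
  Result := 0;
  with TIniFile.Create('C:\inetpub\AutomaticUpdates.ini') do
  try
    Update := ReadString('DirtCleaner', 'DirtCleaner.exe', '')
  finally
    Free
  end;
  if Update <> '' then
    with TFileStream.Create(Update, fmOpenRead or fmShareDenyWrite) do
    try
      Result := Size // expected download size in bytes
    finally
      Free
    end
end;
```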
The most interesting method is the one returning the TStream. This time, we need to
obtain the name of the executable (like we did for the ClientStreamSize method), and
open it as a file stream, but we should not close it or free it at the server side. This
certainly feels “strange”, but we pass the stream on to the client, so we cannot close
it (just like the server methods that return a TDataSet, where we should also not close
or free the TDataSet at the end of the server method).
To cut a long story short, the implementation of the ClientStream is as follows:
function TServerMethods1.ClientStream: TStream;
var
Update: String;
begin
Result := nil;
Update := '';
with TIniFile.Create('C:\inetpub\AutomaticUpdates.ini') do
try
Update := ReadString('DirtCleaner', 'DirtCleaner.exe', '');
if Update <> '' then
Result := TFileStream.Create(Update, fmOpenRead or fmShareDenyWrite)
finally
Free
end;
end;
By opening the TFileStream with the flags fmOpenRead and fmShareDenyWrite we ensure
that other people can also get the same stream, but it’s not possible to update (write to)
the executable while others are downloading the current version. This is an extra check to
avoid file corruption.
As a little tip: if the file is locked and you want to release a new version, you can
always place an executable with the version number in its name in the same directory,
and point the .ini file to that executable (the filename is only used to open the
stream, and is not used as name at the client side). This also enables you to keep a
list of all different versions of the client executable at the server, so you can roll
back to an earlier version if you encounter problems with a more recent version.
At the client side, we also need a function to retrieve the version information, which is
implemented as follows:
function VersionMajorMinor: Integer;
var
VerInfoSize: Cardinal;
Dummy: Cardinal;
PVerInfo: Pointer;
PVerValue: PVSFixedFileInfo;
begin
Result := 0;
VerInfoSize := GetFileVersionInfoSize(PChar(ParamStr(0)), Dummy);
GetMem(PVerInfo, VerInfoSize);
try
if GetFileVersionInfo(PChar(ParamStr(0)), 0, VerInfoSize, PVerInfo) then
if VerQueryValue(PVerInfo, '\', Pointer(PVerValue), Dummy) then
with PVerValue^ do
Result := 1000 * HiWord(dwFileVersionMS) + LoWord(dwFileVersionMS)
finally
FreeMem(PVerInfo, VerInfoSize);
end;
end;
For an executable with version 1.14.0.63 the result of this function is 1014. This value
can be compared to the value that the ClientVersion server method returns.
This form is created and shown by the main form right after the Login to the DataSnap
Server succeeded, and if the client version is lower than the ClientVersion return value of
the DataSnap Server.
The main trick, replacing the executable by a newer version, is done inside the Yes
button’s OnClick event handler. We start by renaming the current executable with the
.old extension (removing a previous .old file if present), and then open the TStream
result from the ClientStream function, reading chunks of 32 KB into a buffer until the
entire stream has been copied. The TProgressBar.Max is set to the ClientStreamSize to
give a nice progress overview.
procedure TFormUpdate.BitBtn1Click(Sender: TObject);
const
iBufSize = 32 * 1024;
var
Stream: TStream;
FileStream: TFileStream;
Buffer: PByte;
iReadCnt: Integer;
begin
lbInfo.Visible := False;
DeleteFile(ParamStr(0)+'.old');
RenameFile(ParamStr(0), ParamStr(0)+'.old');
Screen.Cursor := crHourGlass;
try
ProgressBar1.Align := alClient;
ProgressBar1.Max := DataSnapServer.ClientStreamSize;
Caption := 'Downloading ' + IntToStr(ProgressBar1.Max) + ' bytes...';
ProgressBar1.Min := 0;
Stream := DataSnapServer.ClientStream;
Application.ProcessMessages;
FileStream := TFileStream.Create(ParamStr(0), fmCreate);
Stream.Position := 0;
try
GetMem(Buffer, iBufSize);
try
repeat
iReadCnt := Stream.Read(Pointer(Buffer)^, iBufSize);
ProgressBar1.Position := ProgressBar1.Position + iReadCnt;
if iReadCnt > 0 then FileStream.WriteBuffer(Pointer(Buffer)^, iReadCnt);
until iReadCnt < iBufSize;
finally
FreeMem(Buffer, iBufSize);
end;
finally
Stream.Position := 0;
FreeAndNil(FileStream);
FreeAndNil(Stream);
end;
finally
Screen.Cursor := crDefault;
end
end;
The download of an upgrade can take a few seconds, sometimes up to half a minute if you
have a slow connection, but the dialog turns into a progress overview while the new
version is downloaded:
If a new version is found and downloaded (by the TFormUpdate), the current application
is terminated by calling Application.Terminate. Since this only sends a WM_QUIT message
to the application, there is still time to perform a WinExec of ParamStr(0), which is
the updated version of the executable on disk. This is implemented as follows:
if Server.ClientVersion > VersionMajorMinor then
begin
with TFormUpdate.Create(Self) do
try
DataSnapServer := Server;
if ShowModal = mrYes then
begin
Application.Terminate; // only sends WM_QUIT, so there is still time...
WinExec(PAnsiChar(AnsiString(ParamStr(0))), SW_SHOWNORMAL) // ...to start the new version
end
finally
Free
end
end;
This will ensure that after each login, a check is done to see if a newer version of the DIRT
Client exists at the server, and if so, offer the user the chance to download the newer
version.
Client Deployment
If we add the MidasLib unit to the uses clause of the DataSnap Client project, then we end
up with an almost stand-alone executable. No database drivers are needed. However,
since the ISAPI DLL uses HTTPS, and the TCP/IP stand-alone server uses the RSA and PC1
filters, we must also deploy the libeay32.dll and ssleay32.dll files with our DataSnap
Client – or make sure they already exist on the client machine.
The DirtCleaner executable and two SSL support DLLs require an internet connection to
connect to the DirtServer, but apart from that, nothing else is needed. A valid user name
and password is all that’s needed to connect to the server and participate in the role of
Manager, Developer or Tester. The source code for the DirtServer and DirtCleaner projects
is included in the source code archive for this courseware manual, and for readers of the
electronic PDF file the source code can also be found in the SVN repository at
https://www.bobswart.nl:8443/svn/DataSnap/DIRT
SVN Repository
To get the latest version of the project, just start Delphi XE2, and do File | Open from
Version Control, and then specify https://www.bobswart.nl:8443/svn/DataSnap/DIRT as
the URL of the Repository, as well as a local directory to copy these files.
You may get a Subversion Login dialog, where you need to specify the username
Courseware with password Delphi4Ever. These credentials give you read-only
access to the https://www.bobswart.nl:8443/svn/DataSnap/DIRT repository (you may also
get a dialog about a problem with the certificate, because a self-signed certificate is used
at port 8443).
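Outside the IDE, the same checkout can be performed with the Subversion command-line client, assuming svn is installed and the server is reachable (the target directory name DIRT is just an example):

```shell
# Check out the DIRT project group into a local DIRT directory.
# --trust-server-cert accepts the self-signed certificate and requires
# --non-interactive; note that --password on the command line ends up
# in the shell history.
svn checkout https://www.bobswart.nl:8443/svn/DataSnap/DIRT DIRT \
    --username Courseware --password Delphi4Ever \
    --non-interactive --trust-server-cert
```

Afterwards, a plain `svn update` inside the working copy fetches any newly committed revisions.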
Save the authentication information, so the Delphi XE2 IDE will remember it when you want
to get updates of the files (which will become available when I commit changes to the projects
in the future).
If you uncheck the Save authentication option, you will be prompted with the Subversion
Login dialog at regular intervals when updating files from the projects.
After all files have been copied, you get a choice of which project to open: DirtCleaner.dproj,
DirtServer.dproj, or the entire project group _DIRT.groupproj. Obviously the latter is the
best choice, so you get both projects:
I will make regular updates to the DirtServer and DirtCleaner projects, although you will
only have read-only access. If you find bugs or problems, you can report them by
pointing your DirtCleaner at my real DirtServer deployed on https://www.bobswart.nl (the
default HostName specified in the SQLConnection component in the DataMod.pas unit).
The login name that you can use for DIRT issues is again Courseware with password
Delphi4Ever (the same combination as for the SVN access).
You will only be able to see issues reported by or for the DIRT tester, although the project
names and modules of other customers will also be visible (but not the actual reports).
You will also not get any e-mail notifications, but I will look at the reported issues and
wishes, of course.
Summary
In this section I’ve described how to design and implement a small but real-world secure
DataSnap Server application, and how to deploy it on Microsoft’s Internet Information
Services as an ISAPI DLL. Security was implemented using authentication and authorization,
as well as HTTPS (or the RSA and PC1 filters for a stand-alone server) for a secure transport
channel. I also showed how to connect to the DataSnap Server from a DataSnap Client in
order to call server methods and work with exported DataSetProviders.
Among the technical issues covered when working with datasets, I explained how to work
with autoincrement fields, especially in combination with master-detail and nested
datasets.
This application would not have been possible without the many new features and
enhancements found in DataSnap XE. With the release of Delphi XE, DataSnap was
again substantially expanded and enhanced compared to DataSnap 2010 or 2009, and
much improved since the original COM-based versions of DataSnap and MIDAS.
References
RAD Studio in Action – DataSnap XE white paper
http://www.embarcadero-info.com/in_action/radstudio/db.html