SharePoint 2010 & Host Integration Server 2010 Technical Article
Writer: Jon Fancey
Technical Reviewer: Steve Smaller
Published: May 2012
Applies to: SharePoint 2010 & Host Integration Server 2010
Summary: Microsoft Office SharePoint® Server 2010 enables enterprise IT groups to facilitate collaboration, provide content management, implement business processes, and supply access to information. A large percentage of enterprises have investments in existing IBM programs, messages, and data upon which they rely for mission-critical operations. Microsoft Host Integration Server 2010 offers technologies that assist IT groups in integrating their existing IBM systems with new solutions based on SharePoint Server. In this paper, we will show how to apply the new HIS 2010 features, such as Transaction Integrator, the OLE DB Provider for DB2, the Host File provider, and the WCF Channel for WebSphere MQ, to create customized SharePoint solutions. Along the way, we will compare SharePoint Server 2010 to MOSS 2007 where necessary.
This document is provided “as-is”. Information and views expressed in this document, including URL and other Internet Web site references, may change without notice. You bear the risk of using it.
This document does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes.
© 2012 Microsoft. All rights reserved.
Back in 2010 I wrote a paper 1 showing how to integrate IBM systems and technologies with Microsoft Office SharePoint Server 2007 (MOSS) using Host Integration Server 2009. Although that paper is only two years old, much has changed. With SharePoint 2010, SharePoint has evolved into a rich platform on which to build all kinds of applications. I find modern web-based applications most compelling when examined through the SharePoint lens, because SharePoint 2010 makes it easy to create
composite applications: applications assembled from prebuilt or existing components. Composite
SharePoint-based applications are compelling for several reasons:
Non-developers can build such applications using "power tools" like SharePoint Designer (SPD).
Components that plug into SharePoint can use the capabilities of the underlying platform that would be too complex or time-consuming for a developer to build from scratch.
A component-based approach to the Web offers flexibility in a heterogeneous environment.
Because most enterprise environments include technologies from multiple vendors, developers need the ability to easily and flexibly integrate these technologies. If components are easy to integrate with existing technologies, developers can more easily create web-based applications that both consume data and drive business logic in external systems across platforms.
The good news is that SharePoint 2010, in conjunction with the latest version of Host Integration Server 2010, provides many capabilities that empower developers in this way. If you’ve looked at my previous paper on this topic, you will be struck by how much easier it is to create integration solutions in SharePoint 2010. I hope that’s motivation enough to dig in.
This paper covers three aspects of integration with IBM technologies using SharePoint:
Data integration: how to bring data into SharePoint applications from DB2 and host files
Process integration: how to drive external applications from workflows by calling host programs
Messaging: how to use WebSphere MQ to build loosely-coupled solutions with SharePoint front and center.
This paper shows a progression of underlying techniques, starting off by looking at what’s possible by just using tools with no code and then examining more powerful techniques that use SharePoint 2010's new features with Visual Studio 2010 to provide greater control and more capabilities in your solutions.
To support building composite applications, the paper discusses how to build reusable components that others can employ in their own applications. All of the examples covered are complete and designed as a series of steps so that you can try them for yourself as well.
1 http://download.microsoft.com/download/1/1/2/112BB366-C14B-4DA4-A337-
67613720BF48/HIS2009_MOSS.pdf
An important emerging theme in enterprise applications over the last few years is data, lots of it. The volume of information many organizations must manage is growing more rapidly than ever, creating new opportunities to gather, manipulate, and present that data accessibly. SharePoint provides many facilities in this area, and this paper looks at the key capabilities.
The SharePoint 2010 feature that provides many of these new capabilities is Business Connectivity
Services (BCS), the successor to the Business Data Catalog (BDC) in MOSS 2007. Table 1 summarizes the most significant new capabilities that BCS adds to SharePoint 2010.
Table 1. BCS Features.
Feature | Capability
Availability | BCS is available in all versions of SharePoint. Previously, only MOSS 2007 provided access to the BDC.
Models | You can now create models in Visual Studio to extend BCS connectivity.
.NET connectivity assemblies | You can now create your own connectors to line-of-business systems.
Custom connectors | You can create custom connectors that provide full control over integration and allow more flexible data type mapping and dynamic data types.
Integration with SharePoint Designer | New tools are provided to make it easier to create BCS definitions such as external content types.
Offline data | A framework is provided that can take data from BCS into client applications such as Outlook, update the data locally (even if disconnected), and re-sync.
External lists and content types | A simple process is available to create lists based on BCS definitions and external content types (formerly termed entities).
Web parts | New web parts are available in SharePoint Server.
Before we look at BCS in more detail, let’s look at a simple example of data integration in SharePoint.
The Data View Web Part (DVWP) is a SharePoint feature that makes it easy to surface data. DVWP, a feature of SharePoint since WSS 3.0/MOSS 2007, allows data retrieval from sources that are defined in SharePoint Designer (SPD). To use DVWP, you must:
1. Create a data source.
2. Create and configure an instance of the DVWP on a web page.
Consuming SQL Server is the optimized case, but DVWP can work with any OLEDB- or ODBC-based data provider. Because Host Integration Server 2010 ships with OLE DB providers for both DB2 and Host
Files, configuration is a straightforward process.
SharePoint Designer can be used to create and manage data sources. To create a data source:
1. Launch SharePoint Designer.
Under the Site Objects pane, the Data Sources object enumerates all data sources. These data sources automatically include all lists and libraries in the site that you have open. The following image shows the Data Sources contextual ribbon that appears.
Figure 1, Data Source Ribbon.
The New tab on the ribbon shows different data source types. You can retrieve data from a database (or any other OLEDB- or ODBC-based data source), a SOAP-based web service, a REST-based service, or even a static XML file. You can also create a linked data source to connect two data sources together. (This action is similar to a database join.) Connecting data sources is useful because it allows parent/child relationships to be created and configured in DVWP.
2. Click Database Connection to create a new database connection.
The following dialog appears:
Figure 2, New Data Source Dialog.
3. Click the Configure Database Connection button.
The Configure Database Connection dialog appears:
Figure 3, New Data Source Configure Database Dialog.
You use this dialog to specify configuration details for the data source.
4. The Provider Name dropdown has two options: SQL Server and OLE DB. This example uses the Microsoft DB2 OLE DB provider, but you need more fine-grained control over the configuration than the dropdown alone provides.
5. For that control, check the Use a custom connection string option. Here you can specify the full connection string for the data source. The following section describes the procedure for creating the connection string.
Creating the Connection String
Use the HIS Data Access Tool (DAT) to create the connection string.
1. Open the Start menu, and click the Microsoft Host Integration Server 2010 program folder.
2. Under Data Sources, right-click DB2 OLE DB UDLs and select New Data Source...
3. A wizard to create the connection string opens. This example uses DB2 Express-C 9.7 on Windows 7, which is free to download and use, as the data source. (To use a different version on another platform, you will need to adjust some of these settings.)
a. Select DB2/NT for the Data Source Platform, leave the Network Transport Library as TCP/IP, and click Next.
b. For the address, enter the IP address of the DB2 server. In this example, the server is installed on the local machine, so the address should be 127.0.0.1 (localhost). Click Next.
c. Enter SAMPLE for Initial Catalog, and enter NULLID for the Package Collection. Leave all other fields blank and click Next.
d. Accept the defaults on the next dialog and click Next.
e. On the Security dialog, enter the DB2 user id and password that you set up when you installed DB2, and check the Save Password box. This approach is the easiest to follow during development. However, you should strongly consider using Kerberos or integrated security to avoid storing sensitive data in this way.
f. Click Next, and then click Next again to accept the defaults on the Advanced Options dialog.
g. Review the property values, and then click Next.
h. Test your connection by clicking the Connect button. If the connection test succeeds, click Next, give the data source the name SharePointSamples, and check the Universal Data Link option.
i. Click Next and then Finish to finish creating the connection string.
A connection string similar to the following appears in the window at the bottom of the DAT:
Provider=DB2OLEDB;Password=<password>;User ID=<userid>;Initial Catalog=SAMPLE;Data Source=127.0.0.1;APPC Mode Name=QPCSUPP;Network Transport Library=TCPIP;Host CCSID=37;PC Code Page=1252;Network Address=127.0.0.1;Network Port=50000;Package Collection=NULLID;DBMS Platform=DB2/NT;Process Binary as Character=False;Units of Work=RUW;
Analyzing the Connection String and Securing Connection Data
Let’s dissect this connection string, as much of it is the same for all DB2/SharePoint 2010 integration scenarios. First, the connection string includes the user id and password. DB2 needs these elements in order to connect. The Use Single Sign-On authentication option appears in the Configure Database Connection dialog as described in step 3 of the previous procedure, but this option is deprecated in SharePoint 2010 because the Secure Store Service (SSS) replaces it, and DVWP does not yet support SSS. For these reasons you must include the details in the connection string.
When you create a data source using this process, the details that you provide are stored in an XML file and uploaded to SharePoint. The file is kept in a hidden document library named fpdatasources.
Because anyone else that has access to the site also has access to these details, a security risk could result, and SharePoint Designer displays a security warning. Thus, you should not use this procedure to create the data source if having the connection details in the XML file represents an actual security problem. Instead, as mentioned, you can configure DB2 to use Kerberos 2 or use Windows authentication. Such a configuration removes the need to store the credentials in an insecure file or to specify and transmit user IDs and passwords between servers.
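For illustration only, here is roughly what a credential-free variant of the earlier connection string could look like. This is a sketch, not a tested configuration: it assumes the HIS provider's Integrated Security property and a working Kerberos setup as referenced above, and your environment may require different property values.

```
Provider=DB2OLEDB;Integrated Security=SSPI;Initial Catalog=SAMPLE;
Data Source=127.0.0.1;Network Transport Library=TCPIP;Host CCSID=37;
PC Code Page=1252;Network Address=127.0.0.1;Network Port=50000;
Package Collection=NULLID;DBMS Platform=DB2/NT;Units of Work=RUW;
```

The point is simply that the Password and User ID properties disappear, so nothing sensitive is left in the fpdatasources file.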
The remainder of the connection details that you provided also contain data to establish the connection between DB2 and SharePoint 2010:
The location of the server that hosts the database. The Data Source property specifies this information.
The port on the server to use for connections. The Network Port property provides this information.
The database name. The Initial Catalog property provides this information.
Specifying the Network Transport, Code Pages, and Platform
Because DB2 is a multi-platform database, it also needs data to specify the network transport to use, the required code pages, and the platform itself. Use the following procedure:
1. Go back to step 5 in the earlier section "Creating a Data Source." After you check the Use a custom connection string option, click Edit.
2. Copy the connection string from step 3i in "Creating the Connection String," and paste it into the text box.
3. Click OK and then Next to connect to the database.
The following dialog appears:
2 http://www.ibm.com/developerworks/data/library/techarticle/dm-0603see/index.html
Figure 4, Configure Database Connection.
The dialog enumerates the databases that are accessible through the connection and the tables they contain.
4. From the Database dropdown, click SAMPLE to select the database that is provided with DB2.
5. Select the Or specify custom Select… option to specify SQL statements directly. This option is necessary because SPD does not automatically generate the correct SQL statements for DB2.
6. Click Finish.
The Edit Custom SQL Commands dialog appears:
Figure 5, Edit Custom SQL.
Select the Stored Procedure option, and click OK. (Don’t worry that you cannot see a list of stored procedures at this point.)
7. Click the General tab, and enter the name Sales Data from DB2 for the data source.
8. Click OK to create the data source.
Viewing the XML Data Source Definition
As mentioned in the earlier section "Analyzing the Connection String and Securing Connection Data," the definition for the data source that you just created is stored in an XML file in a hidden SharePoint library.
Open this file using the following procedure:
1. Open SharePoint Designer (SPD), and click All Files in the Site Objects pane.
2. In the list that appears, click _catalogs.
3. Click fpdatasources to see the data source you just created. Click the data source to select it, and then click Edit File in the ribbon.
4. Find the attribute SelectCommandType="StoredProcedure" and add the following attribute after it, substituting your own prefix for the SCHEMA prefix. If you installed DB2 on your machine, your prefix is your username by default (for example, JON.GETSALES).
SelectCommand="SCHEMA.GETSALES"
5. Add the following to the start of the ConnectionString attribute:
Provider=DB2OLEDB;
6. Save the file and close it.
The complete XML file appears in Appendix A of this document.
Creating the Stored Procedure
Now you must create the stored procedure referenced above so that you can invoke it. Use this procedure:
1. Open the Start menu, and click DB2 Control Center.
2. Expand the All Databases node in the tree view, then right-click SAMPLE and select Query…
3. In the query window, enter the following SQL statements:
CREATE PROCEDURE GETSALES()
DYNAMIC RESULT SETS 1
LANGUAGE SQL
BEGIN
DECLARE CS1 CURSOR WITH HOLD WITH RETURN TO CLIENT
FOR
SELECT SALES_PERSON, REGION, SALES, CHAR(SALES_DATE) AS SALES_DATE
FROM SALES;
OPEN CS1;
END!
4. Ensure that the “!” character is specified as the Statement Termination Character at the bottom of the window (the procedure body itself uses “;”, so the statement is terminated with “!” as shown above), and then click the Execute button to create the stored procedure.
By default, the stored procedure is created under your user name as the schema. (This is why you prefixed the SelectCommand in step 4 under "Viewing the XML Data Source Definition" with your username.)
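Before wiring the procedure into SharePoint, you can check it from the same query window. (The JON schema prefix below is illustrative; substitute the schema under which the procedure was actually created.)

```sql
-- Invoke the procedure directly; it should return the rows of the SALES table.
CALL JON.GETSALES();
```

If the call fails with an authorization or "not found" error, the schema prefix is the first thing to double-check, as it must match the one you placed in the SelectCommand attribute.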
Placing Data on the SharePoint Page
Now that the data source is created, you can place the data on a page in SharePoint. To do this, you need to create a new Web part page and add the DVWP to it. The easiest approach is to use the following procedure:
1. In the navigation pane in SharePoint Designer, click the Site Pages object.
2. In the ribbon, click Web Part Page.
The following screen appears:
Figure 6. Web Part Page Layouts.
3. Choose the first layout, full-page vertical. Clicking a layout adds a new page, which is named SalesData.aspx in this example.
4. To add the Data View to the page, select the new SalesData.aspx file and click Edit File in the ribbon to open it.
5. Click in the web part zone rectangle on the page as shown in the following screen:
Figure 7, Add Data View Web Part.
6. Click the Insert tab on the ribbon, and select Data View. This command shows a dropdown of all available data sources, including the one just created.
7. Select Sales Data from DB2 to add the web part to the page and display data from the Sales table.
8. Click the Save icon above the menu bar.
9. Go back to the Site Pages view in the Site Objects pane, right-click the page you just created, and select Preview in Browser as shown in the following illustration:
Figure 8, Preview Page.
This command opens the page in your browser so that you can verify that everything is working as expected. You should now see a page similar to the following screen:
Figure 9, Finished DVWP Web Page.
Summary: Web Page Data Integration with Data View Web Part (DVWP)
As the previous sections have shown, the Data View Web Part (DVWP) in SharePoint provides a simple way to integrate data onto a web page. DVWP offers additional capabilities: it enables aggregation using links to merge data from multiple sources in the results. DVWP also allows you to specify XSLT to control the presentation of the web part and format the results so that you can sort and group items on the page. Inserts, updates, and deletions are supported as well, allowing you to create forms to format the data.
As you’ve also seen in previous sections, DVWP is not without limitations, particularly around security. Another limitation is that some manual work is needed to make DVWP work with DB2: you must perform this work carefully because changes are lost if the data source is later updated in the designer. Perhaps more fundamentally, the DVWP doesn’t integrate external data with SharePoint; it simply fetches and displays it. This distinction is important because by integrating external data more completely with SharePoint, you can take advantage of many other capabilities of the platform, such as Search, that can act on your data. The next section describes these additional capabilities.
As I’ve mentioned already, Business Connectivity Services (BCS) is pivotal to all external data access in SharePoint 2010, so this section examines its capabilities in more detail before discussing how to use it to take advantage of IBM assets in your organization.
Much about BCS remains the same as it was in Business Data Catalog (BDC) in earlier versions of
SharePoint. BDC continues to maintain the model and data definition repository in SharePoint 2010, with new capabilities. MOSS 2007 had three key limitations to adoption and ease of use:
Provision for just databases and SOAP-based web services
Optimization only for read-only scenarios
Lack of tooling
Microsoft Office SharePoint Server 2007 (MOSS) could only access database-based data or web services due to the lack of extensibility in BDC. As a result, direct access to line-of-business systems and consumption of REST-based web services was hard, if not impossible.
Because the database type in the BDC included ODBC, ADO.NET, and OLE DB, you had access to any data source if you had a provider for it. However, limited tooling in the BDC was a blocker to widespread use.
The improvements in BCS include vastly improved tooling support for both developers using Visual Studio 2010 and power users/administrators using Microsoft Office SharePoint Designer (SPD). With this improved tooling, the need to understand the underlying XML file format of the BDC has largely gone away.
The remaining limitation of the BDC was its focus on read-only scenarios. While you could create SharePoint applications that supported a full range of CRUDQ (create, read, update, delete, query) operations, it wasn’t easy, and it often required large amounts of code to use the BDC API directly.
As described in the earlier section "What We'll Cover," the remainder of this paper is split into three parts: data, applications and messaging.
Under "data" the paper looks at DB2 and Host File access and how to bring data from these sources into your SharePoint applications.
Under "applications" the paper shows how to call host programs from SharePoint.
Under "messaging" the paper describes how to use WebSphere MQ to help build flexible, loosely-coupled applications that use the features provided in Host Integration Server 2010.
Before reviewing these areas, it’s time to examine in detail the lynchpin in SharePoint’s integration capabilities: Business Connectivity Services.
BCS is implemented as a service application in SharePoint 2010. Service applications provide fine-grained configuration and control over which servers in a farm they run on. This feature provides more hosting flexibility than was possible in 2007, as it allows multiple instances of a service application to be created and configured independently (for example, each instance could have different security permissions).
The core construct in BCS is the Line of Business (LOB) system. This is an encapsulation of an external system with the necessary connection details specified. An LOB system consists of one or more entities. An entity models an external resource such as a database table or a web service operation. Like objects in object-oriented programming, entities have methods defined that specify their behavior. However, these methods are stereotyped from a built-in set of methods defined by BCS itself that mirror a complete set of CRUDQ operations, with the most common shown in Table 2. One important additional point is that each type in the BCS (LOB system, entity, or method) must have one or more instances defined, in the same way a class must have one or more objects created to perform any actions.
Table 2. BCS Stereotypes.
Method | Purpose
Finder | Retrieves a list of items from the data source
Specific Finder | Retrieves a specific item from the data source
Updater | Updates a specific item in the data source
Deleter | Deletes an item from the data source
Inserter | Adds an item to the data source
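For a relational source such as the DB2 SAMPLE database, a rough way to think about these stereotypes is the SQL each one typically maps to. The sketch below uses the SALES table and SALES_PERSON identifier from the examples in this paper; SCHEMA is a placeholder for your own schema, and the exact statements are whatever you configure in the model.

```sql
-- Finder: return all items
SELECT SALES_PERSON, REGION, SALES_DATE, SALES FROM SCHEMA.SALES;
-- Specific Finder: return one item by its identifier
SELECT SALES_PERSON, REGION, SALES_DATE, SALES FROM SCHEMA.SALES
  WHERE SALES_PERSON = ?;
-- Updater: change an existing item
UPDATE SCHEMA.SALES SET REGION = ?, SALES = ? WHERE SALES_PERSON = ?;
-- Deleter: remove an item
DELETE FROM SCHEMA.SALES WHERE SALES_PERSON = ?;
-- Inserter: add a new item
INSERT INTO SCHEMA.SALES (SALES_PERSON, REGION, SALES_DATE, SALES)
  VALUES (?, ?, ?, ?);
```

You will see exactly this pattern later when the Finder and Specific Finder methods are created in code with RdbCommandText properties.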
Figure 10 shows the key components of BCS. In SharePoint 2010, an entity, its instances, and its methods (and their instances) are grouped together as an external content type (ECT). You can use an ECT to create external lists, much like a regular SharePoint list. However, instead of the content coming from the SharePoint content database, it is obtained indirectly, from the LOB definition in the model. Tooling improvements in SharePoint 2010 also mean that ECTs (and therefore their underlying model, LOB system, entities, and methods) can be easily created in SharePoint Designer. For more general information on BCS, see here 3 .
Figure 10. BCS Architecture.
The generalization of data access through BCS and the stereotypes that it defines offers a powerful set of abstractions.
BCS essentially provides a data access layer for your SharePoint applications that is enabled for both
REST and SOAP web services by default. By normalizing the data access in this way SharePoint can commoditize the data that you define so that your users can get access to and share that data.
Probably the most useful and common scenario for the BCS is to surface external data on pages in a
SharePoint application. Out of the box, SharePoint Designer provides database support only for SQL
Server. Dig a little deeper though, and BCS offers access to a full range of data connectivity options.
Unfortunately using these options is a bit more involved, but the results are just the same: all the capabilities provided for SQL Server data integration are available with DB2 and Host Files as well. The runtime supports the following data access provider types:
3 http://msdn.microsoft.com/en-us/magazine/ee819133.aspx
Table 3. Data Connection Types.
Provider | Description
SQL Server | Native SQL access using .NET provider
Oracle | Access using Microsoft’s Oracle client library
OLE DB | Any OLE DB provider, such as DB2 or Host File
ODBC | Any ODBC-compliant driver
To use the flexibility of BCS with the HIS data providers, a different approach from SPD’s built-in experience is necessary. You have several options to achieve the required results:
1. Work with a SQL Server copy of your DB2 database, and then edit the actual connection details once the ECT has been created.
2. Create a linked server connection between SQL Server and DB2. SPD doesn’t support linked servers directly, so views must be created for each table required.
3. Use the BCS API or web services to programmatically create the definitions.
4. Manually create an XML model definition for the ECT and import it.
5. Create a .NET connectivity assembly or custom connector.
6. Create a Business Data Connectivity Model and edit the connection details.
None of these options is ideal, and none provides the same direct experience as SQL Server does with SPD. However, the same results can be achieved, and this paper illustrates the API option, showing how you can programmatically create external content types in code. Later, the paper looks at how to update and change models.
One of the most powerful aspects of SharePoint is that much of its functionality is accessible not just through the browser but programmatically through a rich set of APIs. BCS is no exception to this, and understanding its API will help you to realize the number of different contexts within SharePoint (and outside of it) in which you can use BCS. For example, using the BCS API, data from external sources and applications can just as easily be integrated in SharePoint workflow or used to build sophisticated Web parts that can encapsulate a particular piece of functionality and enable it to be used and reused across applications. At this point even non-developers can create their own SharePoint applications by reusing your efforts.
The diagram in Figure 11 shows the key classes of the BCS API, which are related in a containment hierarchy.
Figure 11, BCS API Class Diagram.
Let’s look at some code to create a connection to DB2 that will allow us to create an external content type and later to easily create lists from SharePoint Designer.
Getting a Reference to the BCS Service
As BCS runs as a service application in SharePoint, you must obtain a reference to this service in order to interact with it.
1. In Visual Studio, create a new Visual C# Console Application, and name it CreateECT.
2. As SharePoint is 64-bit only and based on .NET 3.5, you must change the project setup to reflect this.
a. Right-click the project in Solution Explorer and select Properties.
b. In the Project Properties dialog, select the Application tab and change the Target Framework to .NET Framework 3.5.
c. Select the Build tab, and change the Platform target dropdown to Any CPU.
3. Add a reference to the following two assemblies, located by default under C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI:
Microsoft.SharePoint.dll
Microsoft.BusinessData.dll
4. Delete the reference to Microsoft.CSharp.
5. Add the following using directives to the top of the class file:
using Microsoft.BusinessData.MetadataModel;
using Microsoft.BusinessData.Runtime;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.BusinessData.Administration;
using Microsoft.SharePoint.BusinessData.Runtime;
using Microsoft.SharePoint.BusinessData.SharedService;
6. Add the following four lines to the Main method. These lines retrieve the BCS service and the underlying metadata storage catalog.
SPFarm farm = SPFarm.Local;
SPServiceProxyCollection spc = farm.ServiceProxies;
BdcServiceApplicationProxy ap = (BdcServiceApplicationProxy)((from sp in spc
    where sp.TypeName.Equals("Business Data Connectivity Service")
    select sp).First().ApplicationProxies.First());
var catalog = ap.GetAdministrationMetadataCatalog();
7. Now you can use the reference to the catalog to create a new model named DB2_SAMPLE, as shown in the following code:
Model DB2Model = Model.Create("DB2_SAMPLE", true, catalog);
DB2Model.Properties.Add("Discovery", "");
Creating an LOB System Definition
Next, you will use the administration API to create a new LOB system definition. As Figure 10 shows, the LOB system definition is a container that is used to define access to the external system. Therefore, you must also provide all required connection details to DB2. This is essentially the same set of connection details you used earlier with DVWP. BCS uses the convention that each database connection property name is prefixed with RdbConnection, and it generates the connection string from the properties that you specify here.
1. Add the following lines of code to the same class file:
LobSystem sampleLobSystem =
    DB2Model.OwnedReferencedLobSystems.Create("DB2_SAMPLE", true, SystemType.Database);
LobSystemInstance lsi =
    sampleLobSystem.LobSystemInstances.Create("DB2_SAMPLE", true);
lsi.Properties.Add("ShowInSearchUI", "true");
lsi.Properties.Add("ConnectionName", "DB2_SAMPLE_LOB");
lsi.Properties.Add("Discovery", "");
lsi.Properties.Add("AuthenticationMode", "PassThrough");
lsi.Properties.Add("Rdbconnection Trusted_Connection", "yes");
lsi.Properties.Add("Rdbconnection Integrated Security", "True");
lsi.Properties.Add("DatabaseAccessProvider", "OleDb");
lsi.Properties.Add("Rdbconnection Provider", "DB2OLEDB");
lsi.Properties.Add("Rdbconnection DBMS Platform", "DB2/NT");
lsi.Properties.Add("Rdbconnection Host CCSID", "37");
lsi.Properties.Add("Rdbconnection PC Code Page", "1252");
lsi.Properties.Add("Rdbconnection Units of Work", "RUW");
lsi.Properties.Add("Rdbconnection User ID", "DB2ADMIN");
lsi.Properties.Add("Rdbconnection PWD", "<PASSWORDHERE>");
lsi.Properties.Add("Rdbconnection APPC Security Type", "PROGRAM");
lsi.Properties.Add("RdbConnection Data Source", "127.0.0.1");
lsi.Properties.Add("RdbConnection Network Address", "127.0.0.1");
lsi.Properties.Add("Rdbconnection Network Transport Library", "TCP");
lsi.Properties.Add("Rdbconnection Package Collection", "NULLID");
lsi.Properties.Add("Rdbconnection Network Port", "50000");
lsi.Properties.Add("RdbConnection Initial Catalog", "SAMPLE");
lsi.Properties.Add("RdbConnection Pooling", "False");
lsi.Update();
To run this code, you must substitute the password that you provided when you installed DB2 for <PASSWORDHERE> in the Add method. If you didn’t choose a default installation, you may also need to change the User ID from DB2ADMIN as well.
2. Next, create the external content type itself. Enter the following lines of code to create a new entity named Sales and specify an identifier named SALES_PERSON for the entity:
Entity e = Entity.Create("Sales", "DB2_SAMPLE_LOB", true,
    new Version("1.0.0.0"), 10000, CacheUsage.Default,
    sampleLobSystem, DB2Model, catalog);
e.Identifiers.Create("SALES_PERSON", true, "System.String");
ECTs were known as entities in MOSS 2007, and some of that language remains in SharePoint 2010, as you can see in this code.
3. Now create a method on the entity using one of the provided stereotypes. In this case you'll create a Finder that returns all the data from the Sales table in the DB2 SAMPLE database. For this you must also specify the SQL statements to execute and the type descriptor. The type descriptor is a description of the data that is returned for each entity instance at run time. This equates to a row in the table and therefore must specify the columns and their data types.
Method getSalesMethod = e.Methods.Create("GetSales", true, false, "GetSales");
getSalesMethod.Properties.Add("RootFinder", "");
getSalesMethod.Properties.Add("RdbCommandText",
    "SELECT SALES_PERSON, REGION, SALES_DATE, SALES FROM <SCHEMANAME>.SALES");
getSalesMethod.Properties.Add("RdbCommandType", "Text");

Parameter salesParameter =
    getSalesMethod.Parameters.Create("Sales", true, DirectionType.Return);

TypeDescriptor returnRootCollectionTypeDescriptor2 =
    salesParameter.CreateRootTypeDescriptor("Sales", true,
        "System.Data.IDataReader, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089",
        "Sales", null, null, TypeDescriptorFlags.IsCollection, null, catalog);

TypeDescriptor returnRootElementTypeDescriptor2 =
    returnRootCollectionTypeDescriptor2.ChildTypeDescriptors.Create("Sales", true,
        "System.Data.IDataRecord, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089",
        "Sales", null, null, TypeDescriptorFlags.None, null);

returnRootElementTypeDescriptor2.ChildTypeDescriptors.Create("SALES_PERSON", true,
    "System.String", "SALES_PERSON",
    new IdentifierReference("SALES_PERSON",
        new EntityReference("DB2_SAMPLE_LOB", "Sales", catalog), catalog),
    null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor2.ChildTypeDescriptors.Create("REGION", true,
    "System.String", "REGION", null, null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor2.ChildTypeDescriptors.Create("SALES_DATE", true,
    "System.DateTime", "SALES_DATE", null, null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor2.ChildTypeDescriptors.Create("SALES", true,
    "System.Int32", "SALES", null, null, TypeDescriptorFlags.None, null);

MethodInstance mi = getSalesMethod.MethodInstances.Create("GetSales", true,
    returnRootCollectionTypeDescriptor2, MethodInstanceType.Finder, true);
mi.Properties.Add("RootFinder", "");
4.
Add another method: a specific finder. As its name implies, this method must return a single row from the Sales table. The code to achieve this is exactly the same as before, except that this time you add an input parameter to the SQL query that will be executed. In the following code you can see that the SALES_PERSON parameter is defined, and the ? parameter substitution in the query returns a single row.
Method m = e.Methods.Create("GetSale", true, false, "GetSale");
m.Properties.Add("RdbCommandText",
    "SELECT SALES_PERSON, REGION, SALES_DATE, SALES FROM <SCHEMANAME>.SALES " +
    "WHERE SALES_PERSON LIKE ? FETCH FIRST 1 ROWS ONLY");
m.Properties.Add("RdbCommandType", "Text");

Parameter salesPersonIDParameter = m.Parameters.Create("@SALES_PERSON", true,
    DirectionType.In);
salesPersonIDParameter.CreateRootTypeDescriptor("SALES_PERSON", true,
    "System.String", "SALES_PERSON",
    new IdentifierReference("SALES_PERSON",
        new EntityReference("DB2_SAMPLE_LOB", "Sales", catalog), catalog),
    null, TypeDescriptorFlags.None, null, catalog);

Parameter saleParameter = m.Parameters.Create("Sale", true,
    DirectionType.Return);

TypeDescriptor returnRootCollectionTypeDescriptor =
    saleParameter.CreateRootTypeDescriptor("Sale", true,
        "System.Data.IDataReader, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089",
        "Sales", null, null, TypeDescriptorFlags.IsCollection, null, catalog);

TypeDescriptor returnRootElementTypeDescriptor =
    returnRootCollectionTypeDescriptor.ChildTypeDescriptors.Create("Sale", true,
        "System.Data.IDataRecord, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089",
        "Sale", null, null, TypeDescriptorFlags.None, null);

returnRootElementTypeDescriptor.ChildTypeDescriptors.Create("SALES_PERSON", true,
    "System.String", "SALES_PERSON",
    new IdentifierReference("SALES_PERSON",
        new EntityReference("DB2_SAMPLE_LOB", "Sales", catalog), catalog),
    null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor.ChildTypeDescriptors.Create("REGION", true,
    "System.String", "REGION", null, null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor.ChildTypeDescriptors.Create("SALES_DATE", true,
    "System.DateTime", "SALES_DATE", null, null, TypeDescriptorFlags.None, null);
returnRootElementTypeDescriptor.ChildTypeDescriptors.Create("SALES", true,
    "System.Int32", "SALES", null, null, TypeDescriptorFlags.None, null);

m.MethodInstances.Create("GetSale", true, returnRootElementTypeDescriptor,
    MethodInstanceType.SpecificFinder, true);

DB2Model.AddEntity(e);
e.Activate();
Note the schema name placeholder <SCHEMANAME> in the preceding code. You'll need to substitute your own schema name for this placeholder. As noted earlier, the schema name is usually your user name if you installed the DB2 SAMPLE database.
Running this console application creates the ECT and adds the two methods to a new Sales entity. Open
SharePoint Designer to see the ECT that is created.
1.
2.
Open SPD and connect to your site.
Click External Content Types in the Site Objects list. Click on the ECT that you just created.
You should see something similar to the following screen:
Figure 12, Created External Content Type.
3.
You can now create a new external list from this ECT. a.
Click the Create Lists and Form button on the ribbon. b.
Enter Sales as the List Name as shown in the following screen.
Figure 13, Create External List. c.
Click OK to create the external list.
4.
Now open your site in the browser, and you should see a new list, Sales, in the navigation bar.
Click it, and the page with the list on it appears, as shown in Figure 14.
Figure 14. External List.
Creating the external list in SPD created a new web part page with the same name as the finder method, GetSales, and added an XSLTListView web part to it, configured to retrieve data from the ECT.
You can do much more with BCS. For example, you can create additional ECTs and define associations between them. This technique is useful for creating master/detail pages using two connected web parts.
In a typical scenario, a selected item from the first web part is passed into the second through parameters to drive the query it executes. SharePoint Server provides out-of-the-box business data web parts for this purpose. However, the BCS data web parts that SharePoint provides are all designed for read-only scenarios. Therefore, knowing the BCS API and how to write your own web parts is extremely useful for building powerful composite SharePoint applications.
You must consider performance when you create ECTs. As you may have noticed, the queries provided in this section return all rows from the database. These rows are returned on every page refresh, even though only a subset of rows actually appears on the page. To limit the number of rows that a query returns, BCS provides filters. Filters allow you to define parameters for the WHERE clause (or for any other part of the query) to reduce the number of rows returned. Make sure that your queries are tuned for performance in this way to avoid placing unnecessary load on both your database and your SharePoint servers. For more information about defining filters, see here 4 .
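To give a flavor of what a filtered version of the Finder might look like, the following fragment parameterizes the query so that a wildcard filter on REGION can drive it. This is an illustrative sketch only: the FilterDescriptors.Create overload and the "Wildcard" filter type string shown here are assumptions about the BCS Administration object model, so verify them against the API reference before relying on this pattern.

```csharp
// Illustrative only: a parameterized Finder query that a wildcard filter
// on REGION can drive, instead of returning every row in the table.
getSalesMethod.Properties.Add("RdbCommandText",
    "SELECT SALES_PERSON, REGION, SALES_DATE, SALES " +
    "FROM <SCHEMANAME>.SALES WHERE REGION LIKE ?");

// Assumed overload: attach a wildcard filter descriptor to the method so
// that web parts can surface the filter in their UI. Check the exact
// FilterDescriptors.Create signature in the BCS Administration
// object model reference before using.
FilterDescriptor regionFilter =
    getSalesMethod.FilterDescriptors.Create("RegionFilter", true, "Wildcard");
```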
Remember that BCS entities are synonymous with external content types. An entity is a definition of an item from the catalog. In SharePoint 2010 you are no longer limited to the entity sources that SharePoint provides (databases and web services). You can now create a custom .NET connector in Visual Studio. Such a connector can handle interaction with a back-end data source or application and expose it to BCS through the entities that you define and populate with that data. And once you have defined a connector in BCS, you can consume it from the out-of-the-box web parts and other functionality in SharePoint.
This example uses the offline file feature of the HIS Host File Provider that was introduced in HIS 2009.
1.
Open Visual Studio 2010 and select Create New Project.
2.
Under Host Integration Projects, select Host File. Enter Banking for the name, and click OK.
3.
With the project open, in Solution Explorer right-click Banking and select Add followed by Add
Host File Library. Provide the name BankingLibrary, and click OK.
4.
A wizard opens. Click Next.
5.
On the next dialog, accept the defaults and click Next.
6.
Click Create.
The library designer appears, allowing you to create a host column definition for a host file. This definition specifies the host file layout that you want to use. Often this definition already exists
4 http://msdn.microsoft.com/en-us/library/hh144965.aspx
but is embedded in a COBOL program or copybook. However, you can extract it automatically by using the Host Definition editor in Visual Studio. This example uses a COBOL program to define the columns of the file. To do this, use the COBOL program in the HIS SDK.
7.
Right-click BankingLibrary in the designer and select Import, then Host Definition. Click Next.
8.
On the next dialog, click the Browse button and enter the following location
C:\Program Files\Microsoft Host Integration Server
2010\SDK\Samples\EndToEndScenarios\WoodgroveBank\3270Application\CICS3270Pgms
9.
Select the wgrvadda.cbl file, and click Open.
The COBOL program appears in the window.
10.
Uncheck the Use Importer Defaults option, and click Next.
11.
The program contains several definitions. This example uses the definition for Customer. To use this definition, uncheck the CUST-REC-KEY item and check CUSTOMER-RECORD, then click Next.
A CUSTOMER_RECORD schema appears in the designer. This schema is the definition of the customer extracted from the COBOL program.
12.
Now you’ll create a table from the schema so that you can set specific properties, such as the host file to use. Remember that you’re using the offline feature of HIS here, so you must specify the offline file to create. Before you create the offline file, right-click Tables and select Add
Table.
13.
In the Properties window, change the property values to the following:
Table 4, Host File properties
Property Name Value
Alias Customer
Host File Name HISDEMO.TXT
Schema CUSTOMER_RECORD
You should now have what is shown in the following screen.
Figure 15, Host File Designer.
14.
You must make a couple of changes to the project so that SharePoint can load the assembly at runtime. Right-click BankingLibrary in the designer, and select Properties.
15.
You need to sign the assembly, which you can do by creating a key. To create a key, open a Visual Studio command prompt as administrator and type the following command:

sn -k c:\demo.snk
This command generates a key file that you can use to sign the assembly.
16.
Expand the AssemblyInformation property, set the DelaySign property to False, and set the KeyFile property to c:\demo.snk. Signing the assembly and adding it to the Global Assembly Cache (GAC) ensures that SharePoint will trust it. You must sign the assembly because at runtime the host file provider must be loaded from within SharePoint.
17.
Save the library (CTRL + S) to generate the assembly, and then close the solution.
18.
The final step is to create the actual host file that you’re going to use. To do this: a.
Open Notepad. b.
Enter the data provided in Appendix C, which is in EBCDIC. c.
Save the file, naming it HISDEMO.TXT. Remember the location where you saved it as you’ll need it for later examples.
This section shows how to create a custom .NET connector for retrieving data and displaying that data through entities that you create.
Designing the Model
The first task in creating the custom .NET connector is to create a new, empty SharePoint project.
1.
Open Visual Studio, and click New Project.
2.
Under Installed Templates, click Visual C#, then SharePoint.
3.
In the list of SharePoint templates click Business Data Connectivity Model, and enter
HostFileModel in the Name text box. Then click OK.
The model designer opens and shows a single dummy entity named Entity1.
4.
Rename Entity1 to Customer in the model designer, and change the entity’s identifier from
Identifier1 to CUSTOMER_SSN.
5.
Now you need to add a field for each column in the customer file. First, open the BDC Explorer.
(If you can’t see it, open the View menu, then click Other Windows and BDC Explorer).
6.
Expand the ReadList method so that it looks like the following screen:
Figure 16. BDC Explorer.
As described in the earlier section "BCS Architecture," BCS entities consist of three principal parts: an identifier, type descriptors and methods. The identifier specifies a unique instance of the entity: for example, it might be the key for a row in a database. A type descriptor defines the parameters that are passed into or out of a method. The method itself is an instance of one of the stereotyped operations listed earlier in Table 2. By default, the project creates the ReadItem and ReadList stereotypes, which allow you to fetch a list of items or a single item by passing an identifier.
Matching Type Descriptors to Host File Definitions
You must now amend the type descriptors to specify the fields in the host file definition. To do this:
1.
Right-click the entity, then click Add Type Descriptor.
Type descriptors are hierarchical, with the leaf descriptors specifying the primitive type being passed. The default primitive type is string, which matches the types of the columns in the file except for CUSTOMER_ZIP, which is an integer. Select System.Int32 to model this column.
2.
Make the same changes to the ReadItem method. The easiest way to do this is to use copy and paste. First copy the type descriptor that you defined for ReadList, and then paste it under the returnParameter in the ReadItem method. The following screen shows the end result:
Figure 17, BCS Explorer.
One final change that you need to make for later parts of this example is to add a custom property to
LOBSystemInstance. To do this:
1.
Select the node under LobSystemInstances in the BDC Explorer.
2.
In the Properties window, click the ellipsis next to Custom Properties.
3.
Add the property ShowInSearchUI and set its type to System.Boolean and its value to True.
(You will see how to search data later, but this property allows you to configure the model for searching in SharePoint.)
This completes the model. The model is the definition that BCS consumes and uses to interact with your connector.
Writing Implementation Methods
The next step is to write the implementation of the methods to fetch data from the host file.
1.
In Solution Explorer, find Entity1.cs, rename it to Customer.cs, and double-click the name to open the entity.
The following code appears:

namespace HostFileModel.BdcModel1
{
    /// <summary>
    /// This class contains the properties for Entity1. The properties keep the data for Entity1.
    /// If you want to rename the class, don't forget to rename the entity in the model xml as well.
    /// </summary>
    public partial class Customer
    {
        //TODO: Implement additional properties here. The property Message is just a sample how a property could look like.
        public string Identifier1 { get; set; }
        public string Message { get; set; }
    }
}

This code defines the dummy entity that was added when the BCS model was created. Now you need to change the code to include the fields in your model. The code you need appears below:

namespace HostFileModel.BdcModel1
{
    public partial class Customer
    {
        public string CUSTOMER_ACCESS_PIN { get; set; }
        public string CUSTOMER_CITY { get; set; }
        public string CUSTOMER_NAME { get; set; }
        public string CUSTOMER_PHONE { get; set; }
        public string CUSTOMER_SSN { get; set; }
        public string CUSTOMER_STATE { get; set; }
        public string CUSTOMER_STREET { get; set; }
        public Int32 CUSTOMER_ZIP { get; set; }
    }
}
All this code does is create the fields in a class to hold an instance (a record in the file) of the entity. The types must match the source to ensure the data is represented correctly.
2.
Now implement the behavior. This is how you will actually retrieve the data and then create instances of the Customer entity (a list) to pass back to BCS. To do this, open the CustomerService class. Again, this class has a default implementation that creates a statically populated instance of the original dummy entity. You will replace the code in the two methods, ReadItem and ReadList.
3.
Add a reference to the Microsoft.HostIntegration.MsHostFileClient assembly, and add the following using directives to the top of the class:

using Microsoft.HostIntegration.MsHostFileClient;
using System.Data;
4.
Consider the following code:

private static IEnumerable<Customer> GetData(string id)
{
    DataSet result = new DataSet();
    using (HostFileCommand cmd = new HostFileCommand(new HostFileConnection(_conn)))
    {
        if (null == id)
        {
            cmd.CommandText = "SELECT * FROM CUSTOMER";
        }
        else
        {
            cmd.CommandText =
                "SELECT * FROM CUSTOMER WHERE KEY (CUSTOMER_SSN) = ('" + id + "')";
        }
        HostFileDataAdapter adapter = new HostFileDataAdapter();
        adapter.SelectCommand = cmd;
        adapter.Fill(result);
    }

    Customer[] entityList = new Customer[result.Tables[0].Rows.Count];
    int n = 0;
    foreach (DataRow row in result.Tables[0].Rows)
    {
        Customer entity1 = new Customer();
        entity1.CUSTOMER_ACCESS_PIN = row["CUSTOMER_ACCESS_PIN"] as string;
        entity1.CUSTOMER_CITY = row["CUSTOMER_CITY"] as string;
        entity1.CUSTOMER_NAME = row["CUSTOMER_NAME"] as string;
        entity1.CUSTOMER_PHONE = row["CUSTOMER_PHONE"] as string;
        entity1.CUSTOMER_SSN = row["CUSTOMER_SSN"] as string;
        entity1.CUSTOMER_STATE = row["CUSTOMER_STATE"] as string;
        entity1.CUSTOMER_STREET = row["CUSTOMER_STREET"] as string;
        entity1.CUSTOMER_ZIP = (Int32)row["CUSTOMER_ZIP"];
        entityList[n++] = entity1;
    }
    return entityList;
}
Data access for both the ReadList method and the ReadItem method is wrapped in a single method that uses the HostFileCommand class. (Because the host file provider also supports offline reading, you can easily try this out for yourself without host connectivity.)
If you’ve written ADO.NET code before, this pattern should look familiar.
a.
The code creates an instance of the Command class, passing the connection string to the constructor and setting the SQL query to be executed. b.
Next it uses an adapter to fill a DataSet object with the results of the query. c.
Finally, it enumerates over the rows in the DataSet object and populates an array, which it returns as an IEnumerable<T> object to BCS.
5.
Add an overload for the GetData method that doesn't require an identifier to be passed. This overload is used for the ReadList method.

private static IEnumerable<Customer> GetData()
{
    return GetData(null);
}
6.
Replace the ReadList and ReadItem method bodies as shown:

public static IEnumerable<Customer> ReadList()
{
    return GetData();
}

public static Customer ReadItem(string id)
{
    return GetData(id).First<Customer>();
}
These methods simply call the GetData method overloads with or without the identifier, as appropriate. The GetData method refers to a _conn variable, which is shown in the following code block. Place this code within the class definition itself. Three pieces of information are needed: the host file provider, the Metadata parameter that specifies the path to the metadata assembly, and the Local Folder property that specifies where to find the offline data files.

const string _conn = @"Provider=SNAOLEDB;" +
    @"Metadata='C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\HostFileModel\BankingLibrary.DLL';" +
    @"Local Folder='C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\HostFileModel\'";
7.
SharePoint 2010 makes it easy to package dependencies with your solution, so the next step is to add the HIS metadata assembly and the host file that you created earlier to the project. In
Solution Explorer right-click the HostFileModel project and select Add, then SharePoint
“Layouts” Mapped Folder. This step adds a new Layouts folder to the project with a
HostFileModel subfolder.
8.
Right-click the HostFileModel folder and select Add followed by Existing Item.
9.
Browse to the BankingLibrary.DLL file that you created earlier, and add it.
This step copies the assembly to the SharePoint Layouts folder when you deploy the model.
10.
Repeat steps 7 through 9, this time adding the HISDEMO.TXT data file.
Building and Testing the Solution
You can now build and deploy the solution to test it. To do this, you can create a Web Part page in SharePoint and add the out-of-the-box business data Web Parts that SharePoint provides. These Web Parts are available only with SharePoint Server, not with SharePoint Foundation.
To create a web part page:
1.
Open your site in a browser.
2.
Click Site Actions in the top-left part of the screen, and then click More Options… .
3.
Select Web Part Page, and click Create. Name the page Customers.aspx, and select the Full Page Vertical layout.
4.
Change the Document Library value to Site Pages, and click Create.
5.
A new page opens for editing with a single container displaying Add a Web Part. Click this text to open the Web part gallery.
6.
The available web parts appear under the Business Data category. This collection is specifically designed to work with BCS. Select the Business Data List web part, and click Add. Your page should look like the following:
Figure 18, Web Part Page.
7.
Click the Open the tool pane link to configure the Web part.
8.
Under Business Data Actions click the Select External Content Type link.
Figure 19, External Content Type Picker.
In this screen you can pick the entity from your model. From here you can select the ECT that you just created.
9.
Click OK to close the edit pane, and select Stop Editing on the ribbon.
This procedure created an ECT in code that connects using the host file provider to read data and return it to BCS. You then created a page, added a web part, and configured the web part to use the ECT. End to end, when the page is rendered, the web part invokes BCS to call the ECT code, fetch the data, and display it. The following screen shows the final result:
Figure 20, Web Part Page.
As this section illustrates, custom connectors are powerful and flexible. It's worth remembering that they support all the BCS features, such as associations and filters, allowing you both to model relationships between entities and to parameterize their usage so that you can pass parameters into an underlying query or stored procedure.
To improve this code, you should add error handling (omitted here for brevity) and remove the hardcoded connection string and SQL queries. Doing this would allow the connector to retrieve any data from any host file for both iSeries and zSeries hosts.
Nondevelopers can use tools rather than write code to work with models. One approach to consider is to create a model in code or in SharePoint Designer and then export the model to its XML model file format, bdcm. You can then open the model in Visual Studio and amend it as required. This technique makes scenarios such as working with DB2 (where SharePoint Designer can’t be used) much easier. For these situations you must change the database connection type in the bdcm file from OleDB to
SQLServer to ensure that the designer can render the model correctly, then change the connection type back to OleDB after making any changes.
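For reference, the connection-type switch described above is a one-line edit in the exported model file. The property name shown below follows the usual BDC model schema, but verify it against your own exported .bdcm file before editing:

```xml
<!-- Found under the LobSystem's Properties element in the exported .bdcm
     file. Temporarily set the value to SqlServer so that the Visual Studio
     designer can render the model; change it back to OleDb before
     importing the model again. -->
<Property Name="DatabaseAccessProvider" Type="System.String">OleDb</Property>
```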
The bdcm file can then be imported and exported in SharePoint using the browser:
1.
Go to the Central Administration site, and under Application Management click Manage Service
Applications.
2.
Click Business Data Connectivity Services in the service application list.
3.
From the dropdown box at the top of the screen, select BDC Models.
4.
You can now export a model or import one.
The following image shows the Import and Export buttons that appear on the ribbon.
Figure 21, Import/Export BDC Model.
The Import and Export buttons allow you to export in the same file format that you use to create models in Visual Studio so that you can make round trips in your editing as well as keep your models under source and version control.
After the ability to pull external data into SharePoint, searching that data and receiving targeted results is the most important capability that SharePoint 2010 offers. Using SharePoint 2010 you can incorporate external data into search results along with many other data sources (including data held within SharePoint itself). Like many features of SharePoint 2010, search is implemented as a shared service, in the same way as BCS. This architecture provides multiple ways to consume the service, including end-user searches through browsers or web services, and programmatic queries.
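As a minimal sketch of the programmatic route, the following fragment issues a keyword query from code running on the SharePoint 2010 server (it assumes a reference to the Microsoft.Office.Server.Search assembly; the site URL and query text are placeholders, and the snippet belongs inside a method):

```csharp
using System;
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;

// Minimal sketch: run a keyword query against a site's search proxy.
// "http://localhost" and "<CITYNAME>" are placeholders.
using (SPSite site = new SPSite("http://localhost"))
{
    KeywordQuery query = new KeywordQuery(site);
    query.QueryText = "<CITYNAME>";
    query.ResultTypes = ResultType.RelevantResults;

    // Execute returns a collection of result tables; the relevant-results
    // table can be read like an IDataReader.
    ResultTableCollection results = query.Execute();
    ResultTable relevant = results[ResultType.RelevantResults];
    Console.WriteLine("Hits: " + relevant.TotalRows);
}
```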
The following figure illustrates the search architecture in SharePoint 2010:
Figure 22. SharePoint Search Architecture.
This architecture is based on three basic operations:
1.
Define a content source (that is, a definition of the store to be searched). The SharePoint 2010 search service crawls this content source to produce the index.
2.
Build an index. The search service builds the Index as the output of its crawl operation and stores the index in SharePoint.
3.
Run queries against the index to return results, including links to the content source (for example, documents that match the query criteria).
This section examines these steps in the context of external data such as the DB2 and Host File data sources discussed earlier in this section. Once you define an external data source like these in BCS you can easily integrate that data source with SharePoint’s search capabilities.
Note: BCS integration with Search is not part of SharePoint 2010 Foundation.
The first step to making external data searchable is to create a Search Service Application. To do this:
1.
Open the SharePoint Central Administration Site.
2.
Under Application Management, click Manage Service Applications.
3.
To create a new instance of the Search service, click New on the ribbon and select Search
Service Application, as shown in the following screen:
Figure 23, Creating a Search Service Application.
4.
A dialog for creating and configuring the Search service appears. a.
In the Name section, type the name DB2 Sales Data. b.
Under Application Pool for Search Admin Web Service enter the value
DB2_Search_Admin in the Application Pool Name field. c.
Leave the default Search Service Account as Network Service. d.
Under the Application Pool for Search Query and Site Settings Web Service section, enter DB2_Search_Query in the Application Pool Name field. e.
Click OK.
5.
SharePoint creates a new Search Service Application and the databases that actually hold the crawled and indexed data and configuration. This process takes a few minutes. Once it is complete you should see the new service you just created in the Service Applications list.
6.
Select the new service to open the Search Administration page, where you define the content source.
7.
Click the Content Sources link under the Crawling section in the navigation bar on the page, as shown in the following screen:
Figure 24, Creating a Content Source.
8.
On the Manage Content Sources page, click the New Content Source link to define the content source.
The following page appears:
Figure 25, Add Content Source.
The key part of this page is the Content Source Type. Selecting Line of Business Data displays the LOB system instances that are deployed. You can choose whether to crawl all BCS system instances or pick a particular one. This example selects the DB2_SAMPLE_LOB instance that you created earlier. If this external data source does not appear, click on Central Administration, then select Application Management and Configure service application associations. Select the web site that you are using and ensure that the Search Service application you just created is associated with the web site.
9.
Enter DB2 Sales Data for Name and check the Start Full Crawl option.
10.
Click OK to schedule the external data for indexing immediately.
11.
Once the indexing process is complete, click OK.
Defining the Scope
The next task is to create a scope. A scope is a filter that lets you set up rules defining which content the query processor includes or excludes.
Creating a scope is a two-step process: create the scope, and then create rules for that scope. Use this procedure:
1.
Under Queries and Results in the Navigation pane, click Scopes.
The View Scopes screen appears.
2.
Select New Scope.
3.
Complete the page as shown in this screen, and click OK.
4.
On the View Scopes page, click the Add Rules link next to the new scope.
The following screen appears:
Figure 26, Creating a Scope.
This is the screen where you define which data is in or out of scope.
5.
Under Scope Rule Type, select Content Source, and select DB2 Sales Data for the Content Source field. Ensure that Include is selected for Behavior, and click OK.
In this screen you are defining a new scope. This scope has one rule specifying that only data from the DB2 ECT that you created earlier is in scope for queries.
Indexing the Data
You can now crawl the DB2 data to index it for searching. Use this procedure:
1.
On the Manage Content Sources page, click the DB2 Sales Data content source, and select Start Full Crawl, as shown in the following screen:
Figure 27. Starting a Crawl.
2.
Refresh the page to display the results of the crawl after the first full crawl finishes. The crawl should only take a few minutes.
Remember that a delay always occurs before new data in the content source appears in users’ search results because a new incremental crawl is required. You can run these incremental crawls on a schedule to balance server load and performance with fresh results.
Testing the Search
The easiest way to test the external data search you configured is to create a new search site. Use the following procedure:
1.
Click Site Actions, then More Options, and select Enterprise Search Center as shown in the following screen:
Figure 28. Creating a Search Site.
The Site collection must have the SharePoint Publishing feature activated for this step to be successful.
2.
Enter the name for your site in the http://localhost/<URL name> text box as shown in the previous screen. This example uses the name Search.
3.
Click Create to create a new search site with a search page as shown in the following screen:
Figure 29. Search Page.
4.
Enter one of the cities from the DB2 Sales table in the SAMPLE database, and click the search button.
A results screen similar to the following one appears:
Figure 30. Search Results.
This section looked at the ways you can enable searching data sources that you define and model in BCS.
As you’ve seen, this task is actually pretty easy, and that is really the point. Many features in SharePoint are designed to work together: use one feature, and you enable another. By doing this you take full advantage of the SharePoint platform, using SharePoint's capabilities rather than just building on top of it.
The WCF channel for IBM WebSphere MQ (WCF MQ channel) has undergone some noteworthy improvements in HIS 2010. This section looks at those improvements in the context of a SharePoint 2010 feature that is a good fit with a distributed, cross-platform, asynchronous technology such as MQ: SharePoint workflow.
The WCF MQ channel provides a new request/reply binding, which the example in this section uses in conjunction with a SharePoint workflow. You can use the WebSphereMQRequestReply binding to implement RPC-style communication across applications and platforms using WebSphere MQ, as illustrated in Figure 31:
Figure 31. MQ Request/Reply Processing.
In this example, a library or list has a workflow associated with it. When a new item is added, a workflow instance is created that interacts with MQ, placing a message on queue A and receiving a reply back on queue B. In this way you can invoke an application that is hosted in another environment and exchange data with it.
One important consideration is to ensure that the workflow doesn't block while waiting for a response from the application, consuming valuable server resources. Windows Workflow provides pluggable workflow services for this purpose: a workflow can use the CallExternalMethod activity to make an asynchronous call to an arbitrary component and later continue once the callee has signaled completion through the HandleExternalEvent activity. These activities allow the workflow state to be serialized and stored, so the workflow can be unloaded from memory and reactivated later. (This feature was available in earlier versions of Windows Workflow, but prior to SharePoint 2010 it was not supported in SharePoint.)
CallExternalMethod allows you to specify a method to be called in a .NET assembly to implement a long-running process. Thus, it is the logical place for a request/reply MQ-based implementation.
However, the WCF MQ Channel is implemented using .NET 4.0, which SharePoint 2010 does not support. (SharePoint is based on .NET 3.5, which does not offer the ability to load a .NET 4.0 assembly.)
Therefore you cannot host the required functionality in SharePoint. Instead, you need a separate process to host the external method and event and to integrate the method and event with SharePoint.
Figure 32 illustrates the changes needed to the design:
Figure 32. Asynchronous Request/Reply with MQ.
Several alternatives are available, such as creating a Windows service or using direct interprocess communication (IPC), but for simplicity the example in this section introduces a web service to broker the call between the SharePoint workflow and MQ. To make this process efficient, the example uses the named pipe binding to create an IPC mechanism from one process to another: the SharePoint .NET 3.5 process and a .NET 4.0 process that loads the MQ channel assemblies. With IIS 7 you can use WAS to activate the host process and service.
In order to implement this scenario, the following steps are required:
1. Create a list in SharePoint to hold data.
2. Create the MQ service to push and receive messages with a downstream application.
3. Create the pluggable workflow component that the workflow will interact with.
4. Create the workflow itself and configure it.
First, open a browser and navigate to your SharePoint site. Create a new list as follows:
1. Click Site Actions in the top left, and select More Options….
2. Select Custom List in the dialog, and enter RoutedMessages for the name. Then click Create.
3. You will now see the list. In the ribbon, click List Settings under the Settings tab. In the Columns section of the page that is shown, click Create Column.
4. Enter Input for the column name and Number for the type, and then click OK.
5. Follow steps 3-4 again and create another Number field named Result.
You will use this list to trigger the process and also to hold the results of the call.
After you create the SharePoint list, create the MQ service that will exchange messages with a downstream application. Use the following procedure:
1. Create a new project in Visual Studio 2010 and, under the WCF templates, choose WCF Service Application. Name it MQService.
2. Add references to the following assemblies, which are located in C:\Program Files\Microsoft Host Integration Server 2010\System by default:
System.ServiceModel.Channels.WebSphereMQ.Headers.dll
System.ServiceModel.Channels.WebSphereMQ.Server.dll
System.ServiceModel.Channels.WebSphereMQ.Channel.dll
To simplify the example, this service is based on a sample in the HIS 2010 SDK, located at the following location:
C:\Program Files\Microsoft Host Integration Server 2010\SDK\Samples\MessageIntegration\MQChannel\RequestReplySample
This SDK sample implements two console applications, which use the MQ request/reply binding to send a message through MQ from one application to the other and then receive a response.
This example uses the server side of the sample so that SharePoint can send a message to it.
3. Because this example invokes the sample code, you need the service contract. Add the following class file to the project:
C:\Program Files\Microsoft Host Integration Server 2010\SDK\Samples\MessageIntegration\MQChannel\RequestReplySample\RequestReplyClientSample\Proxy\HelloWorldServiceImplementation.cs
4. Open the auto-generated IService1.cs file, and change it as follows:

[ServiceContract]
public interface IService1
{
    [OperationContract]
    int PutMessage(int value);
}
5. Ensure that the Service1.svc.cs file is open, and add the following using clauses:

using System.ServiceModel.Channels;
using System.ServiceModel.Channels.WebSphereMQ;
using System.ServiceModel.Channels.WebSphereMQ.Sample;
6. Add the following code to replace the class definition. Note the change of interface to implement:

public class Service1 : IService1
{
    public int PutMessage(int message)
    {
        CustomBinding mqBindingWithSoap = new CustomBinding();
        WebSphereMQRequestReplyTransportBindingElement mqTransportBindingElement =
            new WebSphereMQRequestReplyTransportBindingElement();
        mqTransportBindingElement.MqmdReplyToQueue = "ReplyQueue";
        mqTransportBindingElement.MqmdReplyToQueueManager = "QM";
        mqTransportBindingElement.MqcdConnectionName = "localhost:1414";
        mqTransportBindingElement.MqcdTransportType = "TCP";
        mqTransportBindingElement.ConnectionType = "Server";
        TextMessageEncodingBindingElement textMessageEncodingBindingElement =
            new TextMessageEncodingBindingElement();
        mqBindingWithSoap.Elements.Add(textMessageEncodingBindingElement);
        mqBindingWithSoap.Elements.Add(mqTransportBindingElement);
        EndpointAddress remoteAddress =
            new EndpointAddress("net.mqs://localhost/QM/Messages");
        using (HelloWorldServiceContractClient client =
            new HelloWorldServiceContractClient(mqBindingWithSoap, remoteAddress))
        {
            // Pass the incoming value through to the MQ request/reply call.
            return client.SayHelloRequestReply(message);
        }
    }
}
This code is the service implementation that puts a message on the specified queue and waits for a response on the specified reply-to queue. The following table describes the settings in this code. (You may need to change these to suit your MQ installation.)
Table 5. MQ Settings.

Setting                  Value
Send to Queue            Messages
Reply to Queue           ReplyQueue
Send to Queue Manager    QM
Reply to Queue Manager   QM
7. Open the web.config file in the project, and add the following XML under the <system.serviceModel> tag:

<bindings>
  <netNamedPipeBinding>
    <binding name="">
      <security mode="None"/>
    </binding>
  </netNamedPipeBinding>
</bindings>

This XML is required to enable the named pipe binding on the service.
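For reference, a fuller service configuration using this binding might look like the following sketch. The service and contract names shown (MQService.Service1 and MQService.IService1) are assumptions based on the project as created above; adjust them to match your own service. Because .NET 4.0 WCF supplies default endpoints, the walkthrough itself only needs the binding override, but the explicit <services> section shows what the equivalent full configuration would be:

```xml
<system.serviceModel>
  <bindings>
    <netNamedPipeBinding>
      <!-- Empty name overrides the default named pipe binding configuration -->
      <binding name="">
        <security mode="None"/>
      </binding>
    </netNamedPipeBinding>
  </bindings>
  <services>
    <service name="MQService.Service1">
      <!-- The net.pipe endpoint that the SharePoint-side pluggable workflow
           component will call through WAS activation -->
      <endpoint address="" binding="netNamedPipeBinding"
                contract="MQService.IService1"/>
    </service>
  </services>
</system.serviceModel>
```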
8. Deploy the service to make sure that everything is compiled and working. The easiest way to do this is to follow these steps:
a. Right-click the MQService project and select Publish. The following dialog appears:
Figure 33. Deploy Web Service.
b. In the dialog, ensure that Web Deploy is selected from the Publish method dropdown, and provide the service URL (for example, http://localhost:8080) and the site application (for example, Default Web Site/MQService). Also ensure that Mark as IIS application on destination is checked. Then click the Publish button.
9. Open a browser and enter the full URL (for example, http://localhost:8080/MQService/Service1.svc). You should see a screen similar to the following:
Figure 34. MQ Service.
10. Finally, you need to enable the service for WAS activation using the named pipe channel. To do this:
a. Open IIS Manager by pressing the Windows and R keys together and then entering inetmgr.
b. Under Sites/Default Web Site/MQService, click Advanced Settings… in the Actions pane, and enter http,net.pipe in the Enabled Protocols field, as shown in the following screen.
Figure 35. Setting Named Pipe Protocol in IIS.
Now create the actual workflow using the following procedure:
1. Add a new SharePoint Sequential Workflow project to the solution, and name it MQPluggableWorkflow. You’ll need to provide a site URL, such as http://localhost for a local developer installation. Then click Next.
2. Accept the default workflow name and the option to associate the workflow with a SharePoint list. Then click Next again.
3. The next dialog asks which list the new workflow should be associated with. It defaults to Shared Documents, so change this to select the list that you created earlier, RoutedMessages. Then click Next.
4. Accept the defaults on the conditions-to-start-workflow dialog by clicking Finish.
The designer appears with a single OnWorkflowActivated activity. This activity provides access to the activation context: information such as the list item that triggered the activation, which relates a workflow instance to a list item. Figure 36 adds CallExternalMethod and HandleExternalEvent activities to call another application through MQ. Finally, the figure adds a Code activity that updates the list in SharePoint with the results of the call made through MQ. Notice the red exclamation marks on these activities, showing that they are not yet correctly configured. (You'll configure them later, in the section "Configuring the Workflow.")
Figure 36. Workflow Process.
Implementing an External Method
The next task is to implement the external method to call. To do this:
1. Add a new class to the project, and name it CallService.cs.
2. Add a reference to the WCF service you just created:
a. Right-click the project in Solution Explorer, and select Add Service Reference.
b. In the dialog that opens, enter the service URL from step 8b in the earlier section "Creating an MQ Service." Then click OK to close the dialog.
3. Add a reference to the System.ServiceModel assembly, and then add the following using clauses to the top of the class file:

using Microsoft.SharePoint;
using Microsoft.SharePoint.Workflow;
using System.ServiceModel;
using System.Workflow.Activities;
using System.Workflow.Runtime;
using MQPluggableWorkflow.MQService;
This class must inherit from SPWorkflowExternalDataExchangeService as well as implement a service contract interface. For simplicity you’ll just add the required definition of the contract interface to the same file.
4. Above the class definition, add the following:

[ExternalDataExchange]
public interface IMQService
{
    event EventHandler<TrackingEventArgs> MQEvent;
    int DispatchMessage(int Input);
}
This interface defines the contract for the external method that the workflow will call.
5. Change the CallService class definition to:

class CallService : SPWorkflowExternalDataExchangeService, IMQService
6. Under the class definition, add the following line of code to define the event that will fire to signal that the external event has completed:

public event EventHandler<TrackingEventArgs> MQEvent;
7. Now add the implementation of the DispatchMessage method as follows:

public int DispatchMessage(int message)
{
    int result = 0;
    System.Threading.ThreadPool.QueueUserWorkItem(delegate(object state)
    {
        NetNamedPipeSecurityMode securityMode = NetNamedPipeSecurityMode.None;
        NetNamedPipeBinding binding = new NetNamedPipeBinding(securityMode);
        EndpointAddress address =
            new EndpointAddress("net.pipe://localhost/MQService/Service1.svc");
        binding.SendTimeout = new TimeSpan(0, 5, 0); // 5 minutes
        MQService.Service1Client client =
            new MQService.Service1Client(binding, address);
        result = client.PutMessage(message);
        RaiseEvent((SPWeb)((object[])state)[1],
            new System.Guid(((object[])state)[0].ToString()),
            typeof(IMQService), "MQEvent", new object[] { result });
    },
    new object[] { WorkflowEnvironment.WorkflowInstanceId, this.CurrentWorkflow.ParentWeb });
    return result;
}
DispatchMessage calls the MQService that you created earlier. The call runs in a separate .NET 4.0 process, hosted under IIS, that interacts with MQ. When the service returns, an event is raised that triggers the workflow to continue through its HandleExternalEvent activity.
8. Inheriting from SPWorkflowExternalDataExchangeService requires you to override the abstract methods CallEventHandler, CreateSubscription, and DeleteSubscription. Add these overrides after the DispatchMessage method:

public override void CallEventHandler(Type eventType, string eventName,
    object[] eventData, SPWorkflow workflow, string identity,
    System.Workflow.Runtime.IPendingWork workHandler, object workItem)
{
    if (string.Equals(eventName, "MQEvent", StringComparison.OrdinalIgnoreCase))
    {
        var args = new TrackingEventArgs(workflow.InstanceId);
        args.Result = (int)eventData[0];
        this.MQEvent(null, args);
    }
}

public override void CreateSubscription(MessageEventSubscription subscription)
{
    throw new NotImplementedException();
}

public override void DeleteSubscription(Guid subscriptionId)
{
    throw new NotImplementedException();
}
9. The CallEventHandler method is invoked when DispatchMessage fires the event through the HandleExternalEvent activity in the workflow. When the event is fired, an instance of the following class is used to pass the results from the call back to the workflow. Add this class to the same CallService.cs file, below the CallService class definition:

[Serializable()]
public class TrackingEventArgs : ExternalDataEventArgs
{
    public TrackingEventArgs(Guid id) : base(id) { }
    public int Result;
}
Configuring the Workflow
Finally, you must configure the workflow you created earlier.
1. Go back to the Workflow Designer, and click the callExternalMethod1 activity to select it.
2. In the Properties window, set the InterfaceType and MethodName properties to the values in the following table. You can copy and paste these values from the table into the Properties window.
Table 6. Workflow Activity Settings.

Activity                       Property        Value
callExternalMethodActivity1    InterfaceType   MQPluggableWorkflow.IMQService
callExternalMethodActivity1    MethodName      DispatchMessage
callExternalMethodActivity1    Input           Activity=Workflow1, Path=InputParameter
handleExternalEventActivity1   InterfaceType   MQPluggableWorkflow.IMQService
handleExternalEventActivity1   EventName       MQEvent
handleExternalEventActivity1   e               Activity=Workflow1, Path=eventArgs
Notice that an Input property now appears. This is because the DispatchMessage method that you just added has an input parameter defined, and the input parameter is added to the list of properties.
3. Double-click the Bind Property icon next to the Input property.
4. In the dialog that appears, select the Bind to a new member tab, select Create Property, name the property InputParameter, and click OK.
5. Right-click workflow1.cs in Solution Explorer, and select View Code.
6. Find the new InputParameter property, and replace the property body as shown:

[DesignerSerializationVisibilityAttribute(DesignerSerializationVisibility.Visible)]
[BrowsableAttribute(true)]
[CategoryAttribute("Parameters")]
public Int32 InputParameter
{
    get
    {
        SPListItem item = workflowProperties.Item;
        if (item == null)
        {
            return -1;
        }
        else
        {
            return System.Convert.ToInt32(item["Input"]);
        }
    }
}
7. In the workflow designer, click the handleExternalEventActivity1 activity, and do the following:
a. Set the InterfaceType property to the value from the table in step 2.
b. Select MQEvent as the value of the EventName property from the dropdown.
c. Finally, set the e property by binding it to a new property member called eventArgs.
8. Double-click the code activity, and add the following code to the generated method so that you can track completion of the workflow from the associated list item:

private void codeActivity1_ExecuteCode(object sender, EventArgs e)
{
    SPListItem item = workflowProperties.Item;
    item["Result"] = eventArgs.Result;
    item.Update();
}
Next, register the pluggable workflow assembly in the web.config file for the SharePoint web site. By default, this file is located at C:\inetpub\wwwroot\wss\VirtualDirectories\80.
You now need to get the public key token value from the workflow assembly. Use this procedure:
1. Build the solution to make sure it compiles cleanly.
2. Open a Visual Studio command prompt as administrator, and change the directory to the /bin/debug folder of your workflow project.
3. Run the following command to get the key value:

sn -T MQPluggableWorkflow.dll

4. In web.config, find the <WorkflowServices> element, and add the following entry within it, substituting the PublicKeyToken value that you copied from the preceding sn command:

<WorkflowService Assembly="MQPluggableWorkflow, Version=1.0.0.0, Culture=neutral, PublicKeyToken=[YOUR_TOKEN]" Class="MQPluggableWorkflow.CallService"/>
Testing the Workflow
Now you are ready to test out your workflow.
1. Right-click the MQPluggableWorkflow project, and select Deploy Solution.
As mentioned earlier, this example uses the SDK sample RequestReplySample, located under the installation location of Host Integration Server 2010 at:
\SDK\Samples\MessageIntegration\MQChannel\RequestReplySample\RequestReplyServerSample
2. Build the sample solution.
3. In Windows Explorer, open the App.config file for the RequestReplyServer project. The following table shows the attributes that you must change to fit your IBM WebSphere MQ installation, along with sample values for these attributes:
Table 7. MQ Binding.

Element/Attribute                                            Value
/bindings/mqChannelRequestReplyBinding/@connectionType       Server
/bindings/mqChannelRequestReplyBinding/@mqcdConnectionName   localhost:1414
/services/service/endpoint/@address                          net.mqs://localhost/QM/Messages
/client/endpoint/@address                                    net.mqs://localhost/QM/Messages
This example uses two queues, Messages and ReplyQueue, in the queue manager QM, as described earlier in Table 5. You must set these up in IBM WebSphere MQ Explorer if you wish to follow the example.
4. Close the App.config file, and run the RequestReplyServer.exe file that you built. This action opens a console application, which waits for messages to arrive.
5. Open the SharePoint site in a browser, and click the RoutedMessages list in the navigation pane. You should see a page similar to the following:
Figure 37. Routed Messages List.
6. Click the Add new item button, and fill in the Title as shown in the following screen:
Figure 38. Add New Item Dialog.
7. Leave the Result field blank. The response from the MQ server console application automatically populates this field.
8. Click Save to close the dialog. This action starts a workflow instance.
9. Once the workflow instance completes, you should see the list updated with the Result field populated, as shown in the following screen:
Figure 39. Tracking Workflow Status in Routed Messages List.
This example showed how to integrate a SharePoint application with MQ Server using SharePoint workflow. As you saw, this process has some technical obstacles to overcome, but queuing works well with human workflow because of its asynchronous, long-running characteristics. The new WCF MQ binding improvements for request/reply scenarios make this integration even more compelling, as calling applications through MQ, even across platforms, becomes straightforward.
For the final scenario this section looks at how to invoke host programs directly from SharePoint. To call a host program you must first create a definition of it using Transaction Integrator. This definition takes care of the input and output parameters and creates an assembly to wrap the host program so that the host program is easy to call from .NET. Appendix B shows the steps to create the required TI assembly.
With HIS’s offline capabilities you can run this scenario without a live host connection (or host), making setup straightforward.
Use this procedure to start the integration process:
1. Create a new empty SharePoint project, and name it CallHostWebPart.
2. As with previous examples, you must specify the site URL to use. Then, in Solution Explorer, right-click the project and select Add, followed by New Item.
3. Because you are creating a new web part, select Web Part from the list of items. Name the web part CallHostWebPart, and click Add.
4. The CallHostWebPart.cs file opens for editing. Add the following three fields directly under the class definition:

private SPGridView grid;
private string m_Name;
private string m_PIN;
5. Add the following CreateChildControls override to the class:

protected override void CreateChildControls()
{
    base.CreateChildControls();
    grid = new SPGridView();
    grid.AutoGenerateColumns = false;
    this.Controls.Add(grid);
}
The SPGridView control provides a SharePoint-chromed version of the standard GridView control with behavior familiar to users of SharePoint.
To fulfil the goal of calling the host program as directly as possible, you must first create a reference in the web part directly to an existing TI assembly. (Appendix B has instructions for creating the TI assembly; create it before you add the reference as shown in this section.) This reference allows you to invoke the method on the interface that you created in the TI project. This invocation, in turn, calls the host program and returns the data that you want.
You’ll then allow the data that the host program returns to be connected to another web part that appears on the page. The WebPartManager calls the OnPreRender event handler before asking the web part to render itself, so that is where you create an instance of the TI class in the TI assembly. The GetAccounts method invokes the GETACCTS host program that you saw earlier, with the ACCTINFO structure that you defined in the TI Designer holding the results.
Finally, you create an ADO.NET DataTable so that you can populate it with the results of the call and bind the DataGrid object to it.
This is the procedure for integrating the TI assembly into the web part:
1. Create a TI assembly using the steps in Appendix B.
2. Set a reference to the TI assembly that you created.
3. Add the following code:

protected override void OnPreRender(EventArgs e)
{
    var bank = new TINetBasic.Accounts();
    var accounts = bank.GetAccounts(m_Name, m_PIN);
    System.Data.DataTable table = new System.Data.DataTable();
    table.Columns.Add("ACCOUNTNUMBER");
    table.Columns.Add("ACCOUNTTYPE");
    table.Columns.Add("CURRENTBALANCE");
    table.Columns.Add("INTERESTBEARING");
    table.Columns.Add("INTERESTRATE");
    table.Columns.Add("MONTHLYSVCCHG");
    for (int i = 0; i < accounts.Length; i++)
    {
        table.Rows.Add(accounts[i].Number,
            accounts[i].Type,
            accounts[i].CurrentBalance,
            accounts[i].InterestBearing,
            accounts[i].InterestRate,
            accounts[i].MonthlySvcChg);
    }
    System.Data.DataSet ds = new System.Data.DataSet();
    ds.Tables.Add(table);
    grid.Columns.Clear();
    foreach (System.Data.DataColumn col in table.Columns)
    {
        SPBoundField fld = new SPBoundField();
        fld.HeaderText = col.ColumnName;
        fld.DataField = col.ColumnName;
        grid.Columns.Add(fld);
    }
    grid.DataSource = ds;
    grid.DataBind();
}
The OnPreRender method runs before the web part is rendered – that is, before the web part is asked to emit its HTML for the page. This method gives you a place to put your code to obtain the data that you want to be rendered later.
Notice that the GetAccounts call in the second line of the previous code takes two parameters. You could hard-code these parameters, but web parts provide another way to pass data: connectors. Connectors allow web parts to communicate with one another, passing data between them.
4. Add the following code to the class:

[ConnectionConsumer("Account Name", "AccountName")]
public void GetProviderAccountName(IWebPartField connectProvider)
{
    FieldCallback callback = new FieldCallback(ReceiveName);
    connectProvider.GetFieldValue(callback);
}

public void ReceiveName(object objField)
{
    if (objField != null) { m_Name = (string)objField; }
}

[ConnectionConsumer("Account PIN", "AccountPIN")]
public void GetProviderAccountPIN(IWebPartField connectProvider)
{
    FieldCallback callback = new FieldCallback(ReceivePIN);
    connectProvider.GetFieldValue(callback);
}

public void ReceivePIN(object objField)
{
    if (objField != null) { m_PIN = (string)objField; }
}
SharePoint offers a generic field-passing capability through the IWebPartField interface. This example has two fields, AccountName and AccountPIN, whose consumer methods are decorated with the ConnectionConsumer attribute. This attribute specifies a field name to be surfaced in the UI, and the corresponding method calls a Receive method (ReceiveName or ReceivePIN) to obtain the value passed from the connected web part. The example sets two class-level fields, m_Name and m_PIN, as input to the GetAccounts method.
5. Add the following final piece of code:

protected override void RenderContents(System.Web.UI.HtmlTextWriter writer)
{
    grid.RenderControl(writer);
}
This override of RenderContents simply delegates rendering to the SPGridView control that you added in the earlier CreateChildControls method.
You can now build and deploy the solution to make the new web part available for use in SharePoint. Before you do, however, you must set a few things up in HIS.
1. First, open TI Manager: open the Start menu and select All Programs, then Microsoft Host Integration Server 2010.
2. In TI Manager, under the Transaction Integrator node, click Windows-Initiated Processing.
3. Right-click Remote Environments, and select New, then Remote Environment.
4. In the wizard that appears, click Next.
5. Provide the name SimHost ELM Link, and click Next.
6. Click Next to accept the host environment and programming model.
7. On the Configure Endpoint TCP/IP dialog, enter localhost for IP/DNS Address, and click the Edit button next to the port list.
8. Enter 7511 for New Port, and click Add.
9. Click OK to close the dialog.
10. Click Next and then Finish to complete the remote environment setup.
11. Now right-click the Objects node, and click New, followed by Object... Then click Next.
12. Browse to the TIHostApplicationDef project that you created by following the steps in Appendix B, and then click Next.
13. Select Self-Hosted, and click Next.
14. On the Remote Environment dialog, select SimHost ELM Link, and click Next.
15. On the Creation of WIP Objects dialog, click Next, and then click Finish.
16. TI Manager should now look like the following screen:
Figure 40. Importing a Definition in TI Manager.
A new object has appeared with one method: GetAccounts. This method maps onto a host program that you can call through the defined interface. This example calls the GetAccounts method to retrieve a list of accounts.
17. You can run this code offline, without a host connection, by using the HIS-provided Host Simulator. To open it, open the Start menu and select All Programs, then Microsoft Host Integration Server 2010, then Tools.
18. When the simulator opens, click the Work Spaces menu item, and select Load Work Space.
19. Browse to the following file, again from the HIS SDK under C:\Program Files\Microsoft Host Integration Server 2010, and click Open:
\SDK\Samples\ApplicationIntegration\WindowsInitiated\AppIntTutorials.SHF
20. Right-click ELM Link Simulator in the simulator list, and select Start, as shown in Figure 41:
Figure 41. Host Simulator.
Next, create a page in SharePoint to host the new web part. To do this:
1. Open a browser, and navigate to your site.
2. Select Site Actions, then More Options….
3. In the dialog that opens, select Web Part Page, and click Create. Name the page Accounts.aspx, and select the Full Page, Vertical layout. Then click OK to create the page.
4. A page appears showing a single web part zone. Click Add a Web Part and, under Custom in the Categories section, select CallHostWebPart and click Add.
5. Follow the same steps again, and this time add two Text Filter web parts from under the Filters category. These filters will be connected to your web part to provide the Account Name and PIN parameters.
Finally, you need to connect the filter web parts to your web part. To do this:
1. Click the drop-down on the right side of each filter web part in turn, and select Connections, then Send Filter Values To, then CallHostWebPart.
2. Click Stop Editing in the ribbon.
A screen like the following appears, showing the results:
Figure 42. Host Program Page.
In this example you've seen how to create a custom web part and implement the ability to call a host program using the TI capabilities of HIS 2010. Although the web part in this example was specific to the host program you were calling, with a little more work you could create a generic host-program web part to use across projects and applications. The ability to create reusable web parts is another aspect of SharePoint development that makes composite applications compelling: nondevelopers or power users can ‘wire’ these pre-existing components together to create applications.
One thread that weaves through all of the scenarios discussed so far, but has not yet been addressed, is security. When you connect to external systems, you must consider several questions:
1. What is the security mechanism of the target application, database, or service?
2. Which users in SharePoint should be allowed access to the resource, and what can they do?
3. How can sensitive data be kept from leaking through the security model?
SharePoint 2010 supports several types of security mechanisms for connecting to external systems. The primary mechanisms are Windows-based, claims-based, and non-Windows-based. SharePoint hides these mechanisms from the users of the SharePoint application, so that everything maps to the SharePoint security model.
The security method that you use is usually dictated by the system you’re connecting to. For example, if you consume data from DB2, you may want to provide the credentials in the connection and consider taking advantage of the Secure Store Service (SSS). SSS allows you to store sensitive information, such as passwords, securely and to retrieve it safely. SSS also provides the ability to map SharePoint groups and users to the credentials needed to connect to the data source, so that you can manage who has access to that data source. You map groups and users by creating a container, or application, to hold the necessary details, and then retrieve these details through code (or through tools such as SharePoint Designer).
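As a rough illustration of the retrieval side, the following sketch shows how server-side code might read credentials from an SSS target application. The target application ID "DB2Credentials" is hypothetical, and the exact API shapes should be checked against the SharePoint 2010 Secure Store Service documentation before use:

```csharp
// Sketch only: reads the credentials stored in a hypothetical SSS
// target application named "DB2Credentials". Requires references to
// Microsoft.BusinessData.dll and Microsoft.Office.SecureStoreService.dll.
using System;
using Microsoft.BusinessData.Infrastructure.SecureStore;
using Microsoft.Office.SecureStoreService.Server;

public static class SecureStoreHelper
{
    public static void ReadDb2Credentials()
    {
        // Creates the default secure store provider for the current context.
        ISecureStoreProvider provider = SecureStoreProviderFactory.Create();

        // GetCredentials maps the current SharePoint user to the credential
        // set configured for this target application.
        using (SecureStoreCredentialCollection creds =
            provider.GetCredentials("DB2Credentials"))
        {
            foreach (ISecureStoreCredential cred in creds)
            {
                // Each credential field is typed (user name, password, and
                // so on), and the value is returned as a SecureString.
                if (cred.CredentialType == SecureStoreCredentialType.UserName)
                {
                    // ... use the value to build the DB2 connection string
                }
            }
        }
    }
}
```

Because the credential values come back as SecureString instances, they should be decrypted only at the point of use (for example, when composing the DB2 connection string) and never logged or cached in plain text.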
You can secure all resources defined in BCS at any level of granularity from the model itself down to a particular stereotyped operation. This security enables you to control which users or groups of users can perform which operations, even though they are all mapped to the same SSS application.
One final concern about protecting the data that you expose relates to search. User access limits don't have the desired effect if the search crawler has access to the data source and then surfaces that data in query results.
Fortunately, SharePoint’s designers realized this, and SharePoint automatically security-trims search results to strip out any data that the user is not permitted to see. Without this feature, users without the necessary permissions could view partial information in the results and compromise the data security.
This paper has examined some of the capabilities in SharePoint 2010 for connecting to IBM technologies. The paper looked at three aspects: accessing data, programs, and MQ, each using a different approach. It's important to understand, though, that the techniques and technologies discussed in this paper are interchangeable. For example, you could use the Data View Web Part from SharePoint Designer to access host file data, update it, and save it back, just as with DB2. You could also create a .NET connector instead of using a web part to call host programs.
SharePoint 2010 places much greater focus on building composite applications that integrate external systems and data. With Business Connectivity Services, scenarios that were merely possible in MOSS 2007 are now much more approachable, and with the ability to extend its functionality, you are no longer constrained by the limitations of earlier versions. Have fun experimenting.
Jon Fancey is a co-founder of Affinus, a Microsoft-focused consultancy specializing in hard-core development and integration, having deep expertise in the .NET framework, WPF, BizTalk Server and
SharePoint. Jon also provides on-site training through mvp-training.net. Contact Jon via his blog at jonfancey.com, twitter @jonfancey or at jon.fancey@affinus.com.
<udc:DataSource xmlns:udc="http://schemas.microsoft.com/data/udc" MajorVersion="2" MinorVersion="0">
  <udc:Name>Sales Data from DB2</udc:Name>
  <udc:ConnectionInfo>
    <DataSourceControl><![CDATA[<asp:SqlDataSource id="SqlDataSource1" runat="server"
      __designer:Provider="DB2OLEDB" __designer:customcommand="true"
      ProviderName="System.Data.OleDb" __designer:customconnectionstring="true"
      ConnectionString="Provider=DB2OLEDB;Password=PWD;User ID=DB2ADMIN;Initial Catalog=SAMPLE;Data Source=127.0.0.1;APPC Mode Name=QPCSUPP;Network Transport Library=TCPIP;Host CCSID=37;PC Code Page=1252;Network Address=127.0.0.1;Network Port=50000;Package Collection=NULLID;DBMS Platform=DB2/NT;Process Binary as Character=False;Units of Work=RUW;"
      SelectCommandType="StoredProcedure" SelectCommand="JON.GETSALES" />]]></DataSourceControl>
  </udc:ConnectionInfo>
  <udc:Type MajorVersion="1" MinorVersion="0" Type="Sql"/>
</udc:DataSource>
This appendix shows how to build a Transaction Integrator (TI) assembly with Host Integration Server (HIS). Follow these steps:
1. To begin, open Visual Studio, create a new project, and under Host Integration Projects select Host Application, as shown in the following screen:
Figure 43. Add Host Integration Server Project.
A Host Application project creates the host program definitions and mappings to enable invocation from .NET. TI invokes host functionality using .NET by creating libraries. A library is a container for the type definitions that you’ll create and is packaged as a .NET assembly that loads the TI runtime when invoked.
2. In the Name text box, enter TIHostApplicationDef and click OK.
3. Right-click the project in Solution Explorer and click Add, then Add .NET Client Library. The following screen appears:
Figure 44. Add new library.
4. Select .NET Client Library, enter TINetBasic.DLL in the Name textbox, and click Add.
The library defines your interfaces to the host programs. These interfaces are manifested as actual .NET interfaces with the ability to define methods with their respective parameters. A method is analogous to a host program with the parameters (in/out/inout or return value) representing the storage area (for example, COMMAREA for CICS) of the program that is used to communicate back and forth. Libraries create an integrated programming experience for .NET developers without requiring detailed knowledge of the host environment, languages, or communication.
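Although the designer generates the actual code from the definitions you create, it helps to picture the end result. The sketch below shows roughly what a generated TI interface looks like; the method name matches the GETACCTS program imported later in this appendix, but the parameter names and the AccountInfo type are illustrative assumptions rather than the designer's literal output.

```csharp
// Illustrative sketch only: the real interface is generated by the TI
// designer from the imported host definition. The parameter names and
// the AccountInfo type are assumptions for illustration.
public interface IAccounts
{
    // Corresponds to the GETACCTS host program; the parameters map to
    // fields in the program's COMMAREA.
    void GETACCTS(string accountNumber, out AccountInfo[] accounts);
}
```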
5. A wizard now opens. Click Next. The following dialog appears:
Figure 45. Setting schema type restrictions.
Enter IAccounts for the Interface Name field and click Next. The default interface, IAccounts, is added as an interface type to the TI assembly.
6. The Type Restrictions options limit the schema produced from the definition, as other products (such as BizTalk) can be integrated through the BizTalk Adapters for Host Systems (see references). Because the assembly will be consumed directly from a web part in this example, you can leave the Type Restrictions option set to None.
7. Click Next. The following screen appears:
Figure 46. Remote Environment Settings.
This screen details the remote environment options (that is, the host details). The defaults are
OK as they are, so click Next.
8. Click Create on the final dialog to add the library to the TI project.
To simplify the process of creating the host interfaces, you can import a host program or copybook and extract the interface from the imported item. This example uses one of the SDK-provided host programs: a host implementation of a simple banking application. The source code for this application is provided in the SDK in both COBOL and RPG. This example uses the COBOL source.
9. Right-click the library in the designer and click Import, followed by Host Definition.
10. In the dialog that appears, click the Browse button, and point to the following COBOL program deployed as part of the HIS SDK:
C:\Program files\Microsoft Host Integration Server 2010\SDK\Samples\ApplicationIntegration\WindowsInitiated\SampleMainframeCode\TCP CICS MSLink\CICSGetAcctsLink.cbl
11. The COBOL program appears in the dialog as shown in the following screen:
Figure 47. Importing COBOL Source.
Verify that the dialog shows the GETACCTS program. This program returns a list of customer bank accounts.
12. Click Next to display the Item Options dialog:
Figure 48. Item Options.
This dialog specifies the type of item that is being defined. It has two principal options: as a method on the interface, or as a type. The type option allows you to specify the data types exchanged with the host program. As you’ll be calling the program, the Method option is required. The program name defaults to the value in the Name text box. This option adds the
GETACCTS method to the previously created interface. Ensure that Use Importer Defaults is checked and click Next.
The defaults ensure that the required COBOL definitions are automatically extracted.
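Under the covers, importing a COBOL definition is largely a type-mapping exercise. The comments below sketch some common TI mappings from COBOL picture clauses to .NET types; treat them as typical defaults rather than fixed rules, and the GETCUST example as purely hypothetical.

```csharp
// Common TI COBOL-to-.NET type mappings (illustrative, not exhaustive):
//   PIC X(30)            -> System.String (fixed length 30)
//   PIC S9(4) COMP       -> System.Int16
//   PIC S9(9) COMP       -> System.Int32
//   PIC S9(7)V99 COMP-3  -> System.Decimal (packed decimal)
// So a COMMAREA field declared as
//   01  CUSTOMER-NAME  PIC X(30).
// surfaces as a string parameter on the generated interface, e.g.:
public interface ICustomerExample // hypothetical illustration
{
    void GETCUST(string customerName, out decimal balance);
}
```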
13. Click Finish to complete the wizard and add the definitions to the library.
The designer provides several views over the new definition. You can view the generated .NET code (in C#) by clicking the Definition tab in the designer, or you can see all the code by pressing
ALT+F10. You can also see the host code (in COBOL or RPG, depending on which language you selected) that was used to generate it. You can even start from scratch by creating the interface first, in which case the Host Data Definition tab contains the generated host code required to match it. This can simplify new development, as you can export the host code and then insert it into a new host program. Finally, an XML Schema (XSD) is also created for you. The schema is useful when you work with typed messages and BizTalk, especially when using the host adapters. The schema makes it straightforward to create correctly typed messages in BizTalk to pass to the adapter. If you choose to create BizTalk messages in this way, be sure to select the correct type restrictions to ensure that the schema created is compatible with the technology you plan to use (WCF, ASMX, or BizTalk).
Each method defined on the interface has an Include Context Parameter property, as shown in the following property sheet:
Figure 49. Library Properties.
In this example, when Include Context Param is set to True, a context parameter is added that can specify additional, per-call information. For example, you can override the host program (GETACCTS) to be called, rather than using the defaults specified earlier. This feature supports dynamic scenarios where decisions in code may require routing to different endpoints.
This example does not use the Include Context Param property, so set it to False. Changing it to True would add a parameter of type ClientContext to your method's signature. The ClientContext parameter can then be set programmatically during the setup of the call parameters and passed as a reference.
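Once the assembly is built, signed, and registered (steps 14 through 18 below), calling the host program from .NET client code is an ordinary method call. The following sketch is an assumption about the generated shapes: the concrete class name Accounts, the method signature, and the AccountInfo type are illustrative, so check the Definition tab in the designer for the real generated names.

```csharp
// Sketch: invoking the TI library from client code. The Accounts class
// name, method signature, and AccountInfo type are assumptions; the
// designer's Definition tab shows the actual generated code.
using System;
using TINetBasic; // the TI assembly created in this appendix

class Program
{
    static void Main()
    {
        Accounts accounts = new Accounts(); // generated TI class
        AccountInfo[] results;
        accounts.GETACCTS("1234567890", out results);
        foreach (AccountInfo account in results)
        {
            Console.WriteLine(account);
        }
    }
}
```

Because the remote environment details were captured in the library's properties, no host connection code appears here; the TI runtime resolves the endpoint when the method is invoked.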
14. Click the library name in the designer to highlight it. In the Properties window, expand the Assembly Information property, set the KeyFile property as shown in the following screen, and change the DelaySign property to False:
Figure 50. Set TI Assembly Strong Name.
15. Open a Visual Studio command prompt: from the Start menu, select All Programs, then Microsoft Visual Studio 2010, then Visual Studio Tools, and run the command prompt as an administrator by right-clicking it and selecting Run as administrator.
16. Create a new key file to sign the assembly with by typing the following:
sn -k c:\demo.snk
17. Save the project to create the TINetBasic assembly, and close Visual Studio.
18. You must now add this assembly to the GAC. In the Visual Studio command prompt, navigate to the project folder and enter the following:
gacutil -i TINetBasic.dll
Cut and paste the following sample data into Notepad and save it as HISDEMO.TXT:
Á•„™…¦@Ä¥‰¢@@@@@@@@@@@@@@@@@@øøøùù÷÷÷÷öðòö@ôð£ˆ@Á¥…@ÕÅ@@@@â
…££“…@@@æÁ@@ùøññõôòõ÷òòñòóô@@@ñòóøÒ‰”@Á’…™¢@@@@@@@@@@@@@@@@
@@@@@ñññòòóóóóòðõðð@ÕÅ@ùø£ˆ@â£K@@@Ù…„”–
•„@@@æÁ@@ùøðõòôòõøøñóòññ@@@ñóõñÑ…††@ã…—
…™@@@@@@@@@@@@@@@@@@@@ñòóôõö÷øùòòõñ@Å““‰–
£@Á¥…•¤…@@â…££“…@@@æÁ@@ùøñðôòôõõõõðñ÷ó@@@ñöõôÁ”¨@⣙•„…@@@@@@@@
@@@@@@@@@@@ñññòòóóóóòñòñ@ÕÅ@ññò@ד@@@@@@×–
™£“•„@@ÖÙ@@ùôõõöõñóôôôõõõõ@@@õö÷ø
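The block above is not corrupted text: it is the sample host file data rendered as EBCDIC bytes, which look scrambled when displayed as ANSI characters. If you want to inspect the decoded contents, the following is a minimal sketch; the file path and the assumption that the data uses CCSID 37 (matching the Host CCSID=37 setting in this paper's connection strings) are mine, not the SDK's.

```csharp
// Sketch: decode the EBCDIC sample file for inspection. Assumes the
// file path (C:\HISDEMO.TXT) and that the data uses CCSID 37 (IBM
// US/Canada EBCDIC), matching the Host CCSID=37 connection setting
// used elsewhere in this paper.
using System;
using System.IO;
using System.Text;

class DecodeEbcdic
{
    static void Main()
    {
        byte[] raw = File.ReadAllBytes(@"C:\HISDEMO.TXT");
        string text = Encoding.GetEncoding(37).GetString(raw); // IBM037
        Console.WriteLine(text);
    }
}
```

On the .NET Framework versions contemporary with HIS 2010, code page 37 is available to Encoding.GetEncoding by default.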
For more information:
http://sharepoint.microsoft.com/ : SharePoint Web Site
http://www.microsoft.com/hiserver/ : Host Integration Server Web Site
Did this paper help you? Please give us your feedback. Tell us on a scale of 1 (poor) to 5 (excellent) how you would rate this paper, and why you have given it this rating. For example:
- Are you rating it high due to having good examples, excellent screen shots, clear writing, or another reason?
- Are you rating it low due to poor examples, fuzzy screen shots, or unclear writing?
This feedback will help us improve the quality of white papers we release.
Send Feedback