API documents (optional)
Please extract all items from the root directory of the document package into the root directory of the data service site. If you do not wish to install the documents, they are also available here.
Note: the data service now runs under .NET 4.5.1 and ASP.NET MVC 5. One should be able to replace the old service with the new one without affecting any client applications that are still running under .NET 4.0.
Introduction
User identification/authentication and role-based group authorization are required functionalities of many multi-user applications; ASP.NET websites are particular examples of such applications. In the following article, we call a system supporting this collection of functionalities a membership management system.
A membership management system consists of at least a data source used to store user, role, and user profile information, and a set of APIs through which applications can interact with the data source to realize membership management.
The ASP.NET framework has a built-in, simple-to-use declarative mechanism for membership management, provided that the applications have implemented and configured a set of provider classes derived from a set of predefined base classes. These are called custom providers in the sequel.
The present article describes a set of custom providers for a service-based membership management system, shown in the following diagram:
Figure: Service-based membership management system, where the object-oriented membership, role, and profile data engine runs as an independent service and communicates with applications using remote service calls.
The data service itself is not discussed in detail in this article. Briefly, it is a custom-built service based on a specific relational data schema, shown in the following:
Figure: Schematic view of the relational data schema. An arrow indicates both the direction of the corresponding data dependency and a 1-to-many relationship. The direction of the arrows is opposite to that of the similar ones in a corresponding UML diagram.
It has a complete object-oriented service API through which client applications can manipulate and query the relational data source without having to use any object-relational mapping (ORM) framework. It also does not depend on a specific database engine. The downloads include a demo edition of the service, which is also hosted online (see here); it is based on a very simple persistable in-memory relational database of ours, intended for demo or testing purposes. It can be re-attached to a relational database engine if needed.
The demo site is initially configured to use the demo service provided online. However, for performance and/or other reasons, a reader might be interested in downloading the service and installing it on his/her own machine for closer inspection, using the above download link or the ones inside the demo site. This is especially true if the reader tries to run the test project for the providers: first, the tests are designed to allow only one test process running against a data service at a time; the result of running multiple test processes at the same time against a shared data service is unpredictable. Second, some tests are time sensitive; a sufficiently long network delay could cause them to fail.
The demo data service site contains more detailed information about the data service. Here are some of the unique features of the present approach that are relevant to this article:
- Applications and users in the present membership management system have a many-to-many relationship. A user can be a member of multiple applications and an application can have multiple members. This feature is desired in many applications; not having such a relationship would defeat one of the major purposes of running a membership management system as a service.
- The role system is hierarchical, namely a role can have a parent role and/or child roles. Although a hierarchical role system is more complicated in itself, it can significantly simplify detailed authorization and access control logic in the client software.
- Centralized, service based membership management is a desired feature in many modern application scenarios.
The custom providers act as clients of the above-mentioned data service, which has simple and intuitive access APIs. The documents for the API are contained inside the demo data service website mentioned above.
On the security side, the hash algorithms MD5 and SHA1 are considered broken. The present membership provider uses the recommended SHA256 (HMACSHA256, to be precise) to enhance the security of hashed passwords.
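For readers who want to see what a keyed SHA-256 password hash looks like in code, the following is a minimal sketch using the standard .NET HMACSHA256 class; it is illustrative only and is not necessarily the provider's exact EncodePassword implementation (using the machine key's validation key as the HMAC key is an assumption made here for illustration):
using System;
using System.Security.Cryptography;
using System.Text;

internal static class PasswordHashSketch
{
    // Keyed SHA-256 hash of a password. The key would typically come from the
    // site's <machineKey> validation key (an assumption for illustration only).
    public static string Hash(string password, byte[] key)
    {
        using (var hmac = new HMACSHA256(key))
            return Convert.ToBase64String(hmac.ComputeHash(Encoding.Unicode.GetBytes(password)));
    }
}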
The article includes assemblies for the custom provider library, complete source code, unit-test code for the providers, code for a demo website, and the custom data service website for the corresponding relational data source, together with API documents.
Background
Microsoft provides default SQL providers that come with Visual Studio:
- The pre-ASP.NET MVC 4.0 ones are documented here, here, and here. They are based on SQL Server.
- ASP.NET MVC 4.0 uses a different default membership provider, called SimpleMembership (see here). It includes support for OAuth. It is also based on different editions of SQL Server.
- Universal Providers are available via NuGet (see also here).
The reference here provides more detailed information about them.
They are relatively unsophisticated and intended for introductory purposes. For example, they are implemented based on the assumption that a user belongs to a single application (a many-to-1 relationship), as shown in the following diagram, which is constructed from the data schema of the database supporting the Universal Providers. Here, the same real user using different applications must have different identities inside the same system, which can confuse the user as well as the security or personnel departments, and it does not support an extensible multi-application system. The role system there is also too simple to support an extensible system, as will be explained in more detail in the following.
Figure: The application <=> user relationship assumed by the ASP.NET default providers and many other custom ones. Here an application has many users but a user can belong to only one application.
For users with more demanding requirements (e.g., 1. auditing; 2. single sign-on, which is also discussed here; 3. system extensibility or scalability; etc.), the underlying framework cannot support their needs due to these restrictions, at least not without substantial tweaking inside their own code bases, which creates maintenance problems.
Besides the default ones provided by Microsoft, there are many custom implementations of these providers available on the web, including ones on CodeProject. Most of the ones with source code are not complete implementations of the provider APIs.
Some ASP.NET users might have implemented their own systems, while others try to live with the default providers by working awkwardly around their limitations. It is hoped that the present contribution provides some value, at least to the latter group.
The Membership Provider
It implements the abstract MembershipProvider base class. This section introduces a few methods that walk the reader through how the data service API can be accessed. Details about the data service API are documented here. Visual Studio's IntelliSense can also be used to discover what's available.
Initialization
This method is called the first time the provider is referenced:
public sealed class AspNetMembershipProvider : MembershipProvider
{
....
public override void Initialize(string name, NameValueCollection config)
{
if (config == null)
throw new ArgumentNullException("config");
if (string.IsNullOrEmpty(name))
{
name = "AspNetMembershipServiceProvider";
}
if (String.IsNullOrEmpty(config["description"]))
{
config.Remove("description");
config.Add("description",
"Asp.Net Membership Service Provider");
}
base.Initialize(name, config);
pApplicationName = GetConfigValue(config["applicationName"],
System.Web.Hosting.HostingEnvironment.ApplicationVirtualPath);
... setting up other parameters according to values set inside the site
... Web.config file ...
lock (syncRoot)
{
if (_cctx == null)
_cctx = svc.SignInService(new CallContext(), null);
CallContext cctx = _cctx.CreateCopy();
Configuration cfg = WebConfigurationManager.OpenWebConfiguration(
System.Web.Hosting.HostingEnvironment.ApplicationVirtualPath);
machineKey = (MachineKeySection)cfg.GetSection("system.web/machineKey");
if (machineKey.ValidationKey.Contains("AutoGenerate"))
{
if (PasswordFormat != MembershipPasswordFormat.Clear)
{
throw new ProviderException("Hashed or Encrypted passwords " +
"are not supported with auto-generated keys.");
}
}
Application_ServiceProxy apprepo = new Application_ServiceProxy();
cctx.DirectDataAccess = true;
List<Application_> apps = apprepo.LoadEntityByNature(cctx, ApplicationName);
if (apps == null || apps.Count == 0)
{
cctx.OverrideExisting = true;
var tuple = apprepo.AddOrUpdateEntities(cctx, new Application_Set(),
new Application_[] {
new Application_ { Name = ApplicationName }
});
app = tuple.ChangedEntities.Length == 1 &&
      IsValidUpdate(tuple.ChangedEntities[0].OpStatus) ?
      tuple.ChangedEntities[0].UpdatedItem : null;
cctx.OverrideExisting = false;
}
else
app = apps[0];
}
if (app == null)
throw new Exception("Member provider initialization failed.");
}
... other methods ...
}
Here one needs to create a variable of type CallContext
and sign in to the service, like this:
lock (syncRoot)
{
if (_cctx == null)
_cctx = svc.SignInService(new CallContext(), null);
...
}
The returned value (of type CallContext) is assigned to a global variable _cctx, which is used to create a copy each time a provider accesses the API methods. It contains per-client information and must be initialized on the service side (the current version of the data service is not very strict about this, but future versions will be). Since a website is multi-threaded, all three providers could try to access the same sign-in method at the same time; the lock ensures that only one call is made. The initializer then checks whether the application named inside the Web.config file (specified by the value of the "applicationName" attribute of the provider node) exists inside the data source by calling the LoadEntityByNature method. If it is not found, an entry for it is created by calling the AddOrUpdateEntities method:
CallContext cctx = _cctx.CreateCopy();
...
Application_ServiceProxy apprepo = new Application_ServiceProxy();
cctx.DirectDataAccess = true;
List<Application_> apps = apprepo.LoadEntityByNature(cctx, ApplicationName);
if (apps == null || apps.Count == 0)
{
cctx.OverrideExisting = true;
var tuple = apprepo.AddOrUpdateEntities(cctx, new Application_Set(),
new Application_[] {
new Application_ { Name = ApplicationName }
});
app = tuple.ChangedEntities.Length == 1 &&
      IsValidUpdate(tuple.ChangedEntities[0].OpStatus) ?
      tuple.ChangedEntities[0].UpdatedItem : null;
cctx.OverrideExisting = false;
}
else
app = apps[0];
Class Application_ServiceProxy is a proxy class for the Application_ entity set service. The class name of an entity service proxy follows the pattern:
EntityServiceClassName := <entity type name> + "ServiceProxy"
where <entity type name> is the class name of the entity inside the data source. For the present membership management system, these are
{ Application_, Role, User, UserProfileType, UserProfile, UsersInRole, UserAppMember }
and
DataSourceServiceClassName := <data source name> + "ServiceProxy"
where <data source name> is the name of the data source. For this system, it is "AspNetMember". This service interface is used to
manipulate the overall aspects of the data source.
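As a quick illustration of these naming patterns, using only proxy classes that appear elsewhere in this article:
// entity set proxies: <entity type name> + "ServiceProxy"
var appRepo  = new Application_ServiceProxy();   // proxy for the Application_ entity set
var userRepo = new UserServiceProxy();           // proxy for the User entity set
// data source proxy: <data source name> + "ServiceProxy"
var dbRepo   = new AspNetMemberServiceProxy();   // proxy for the whole AspNetMember data source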
One may have noticed the method LoadEntityByNature, which accepts the application name as one of its arguments. This function loads entities according to a set of intrinsic identifiers (intrinsic IDs) of an entity. The concept of an intrinsic ID is one of the extensions to the relational data schema introduced in our data systems. The set of intrinsic IDs reflects the nature of the entity; an entity set is not allowed to have more than one entity with the same set of intrinsic IDs. Intrinsic IDs are immutable, even in distributed data stores. They therefore cannot be an auto-generated primary key, but they can be another kind of primary key, like a GUID.
Here, an Application_ is naturally identified by its name. The other entity sets listed above also have their own intrinsic IDs assigned in the extended relational data schema. The AddOrUpdateEntities method enforces the following logic:
- If the client creates an entity and calls this method to have it added to the data source, then
- if an entity with the same set of intrinsic IDs already exists, an exception is thrown unless the OverrideExisting property of cctx is set to true, in which case the existing entity is overwritten;
- otherwise, the entity is added to the data set.
- If the entity is loaded from the data source first and then, after certain processing, is sent to this method, it will be updated, provided any changes exist (see the sketch below).
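The update branch (the last item above) is the one the membership provider relies on when, for example, an existing user's e-mail address changes. A condensed sketch, reusing only calls shown elsewhere in this article (username and newEmail are hypothetical local variables):
CallContext cctx = _cctx.CreateCopy();
UserServiceProxy usvc = new UserServiceProxy();
// load by intrinsic id (the user name), modify, then send the change back
List<User> lu = usvc.LoadEntityByNature(cctx, username);
if (lu != null && lu.Count > 0)
{
    User udata = lu[0];
    udata.Email = newEmail;
    udata.IsEmailModified = true;   // mark the column as changed
    usvc.AddOrUpdateEntities(cctx, new UserSet(), new User[] { udata });
}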
Create a user
As the documentation says, this method should add a new membership record for a user within the current application to the data source.
public override MembershipUser CreateUser(string username,
string password,
string email,
string passwordQuestion,
string passwordAnswer,
bool isApproved,
object providerUserKey,
out MembershipCreateStatus status)
{
ValidatePasswordEventArgs args = new ValidatePasswordEventArgs(username,
password,
true);
OnValidatingPassword(args);
if (args.Cancel)
{
status = MembershipCreateStatus.InvalidPassword;
return null;
}
CallContext cctx = _cctx.CreateCopy();
try
{
UserSet us = new UserSet();
UserAppMemberSet ums = new UserAppMemberSet();
UserServiceProxy usvc = new UserServiceProxy();
User udata = null;
List<User> lu = usvc.LoadEntityByNature(cctx, username);
if (lu == null || lu.Count == 0)
{
if (providerUserKey != null &&
usvc.LoadEntityByKey(cctx, providerUserKey.ToString()) != null)
{
status = MembershipCreateStatus.DuplicateProviderUserKey;
return null;
}
if (RequiresUniqueEmail)
{
var x = GetUserNameByEmail(email);
if (!string.IsNullOrEmpty(x))
{
status = MembershipCreateStatus.DuplicateEmail;
return null;
}
}
DateTime createDate = DateTime.UtcNow;
if (providerUserKey == null)
{
providerUserKey = Guid.NewGuid();
}
else
{
if (!(providerUserKey is Guid))
{
status = MembershipCreateStatus.InvalidProviderUserKey;
return null;
}
}
udata = new User();
udata.IsPersisted = false;
udata.ID = providerUserKey.ToString();
udata.Username = username;
udata.Password = EncodePassword(password);
udata.PasswordFormat = pPasswordFormat.ToString();
udata.Email = email;
udata.PasswordQuestion = passwordQuestion;
udata.PasswordAnswer = passwordAnswer;
udata.IsApproved = isApproved;
udata.CreateOn = createDate;
udata.LastPasswordChangedDate = createDate;
udata.FailedPasswordAttemptCount = 0;
udata.FailedPasswordAttemptWindowStart = createDate;
udata.FailedPasswordAnswerAttemptCount = 0;
udata.FailedPasswordAnswerAttemptWindowStart = createDate;
udata.Status = us.StatusValues[0];
UserAppMember memb = new UserAppMember();
memb.ApplicationID = app.ID;
memb.UserID = udata.ID;
memb.MemberStatus = ums.MemberStatusValues[0];
memb.LastStatusChange = createDate;
memb.LastActivityDate = createDate;
memb.Comment = "";
udata.ChangedUserAppMembers = new UserAppMember[] { memb };
var v = usvc.AddOrUpdateEntities(cctx, us, new User[] { udata });
status = v.ChangedEntities.Length == 1 && IsValidUpdate(v.ChangedEntities[0].OpStatus) ?
MembershipCreateStatus.Success : MembershipCreateStatus.DuplicateUserName;
MembershipUser user = GetUserFromModel(udata, memb);
return user;
}
else if (CheckPassword(password, lu[0].Password, lu[0].PasswordFormat))
{
DateTime createDate = DateTime.UtcNow;
udata = lu[0];
if (udata.Email != email)
{
udata.Email = email;
udata.IsEmailModified = true;
usvc.EnqueueNewOrUpdateEntities(cctx, us, new User[] { udata });
}
UserAppMemberServiceProxy membsvc = new UserAppMemberServiceProxy();
UserAppMember memb = membsvc.LoadEntityByKey(cctx, app.ID, udata.ID);
if (memb != null)
{
status = MembershipCreateStatus.Success;
return GetUserFromModel(udata, memb);
}
else
{
memb = new UserAppMember();
memb.IsPersisted = false;
memb.ApplicationID = app.ID;
memb.UserID = udata.ID;
memb.MemberStatus = ums.MemberStatusValues[0];
memb.LastActivityDate = createDate;
membsvc.AddOrUpdateEntities(cctx, ums, new UserAppMember[] { memb });
status = MembershipCreateStatus.Success;
return GetUserFromModel(udata, memb);
}
}
else
{
status = MembershipCreateStatus.DuplicateUserName;
return null;
}
}
catch (Exception e)
{
if (WriteExceptionsToEventLog)
{
WriteToEventLog(e, "CreateUser");
}
status = MembershipCreateStatus.UserRejected;
return null;
}
finally
{
}
}
After validating the password, the method first checks whether a user whose Username (which is the sole intrinsic ID) equals username exists:
UserServiceProxy usvc = new UserServiceProxy();
User udata = null;
List<User> lu = usvc.LoadEntityByNature(cctx, username);
if (lu == null || lu.Count == 0)
{
... case 1: not found ...
}
else if (CheckPassword(password, lu[0].Password, lu[0].PasswordFormat))
{
... case 2: found and supplied a valid existing password ...
}
else
... case 3: reject ...
It then proceeds differently according to the result. There are three possibilities:
- Case 1: A user with the specified username is not found. The method will try to add a user record and a membership record for the current application into the data source:
User udata = null;
...
{
... check the validity of various input parameters and set the
... status if neccessary according to the documents ...
udata = new User();
.. assign various properties of udata ...
UserAppMember memb = new UserAppMember();
.. assign various properties of memb ...
udata.ChangedUserAppMembers = new UserAppMember[] { memb };
var v = usvc.AddOrUpdateEntities(cctx, us, new User[] { udata });
status = v.ChangedEntities.Length == 1 && IsValidUpdate(v.ChangedEntities[0].OpStatus) ?
MembershipCreateStatus.Success : MembershipCreateStatus.DuplicateUserName;
MembershipUser user = GetUserFromModel(udata, memb);
return user;
}
Note that the User entity has a dependency set, UserAppMembers, according to the data schema. The corresponding ChangedUserAppMembers property of a User entity is used to build an interdependent entity graph and have it added or updated in the data source. All dependency sets, if any, of an entity have a corresponding property named in the same pattern, which can be used to update an entity graph of any complexity.
- Case 2: The user is found; it could be that an existing user of the system is trying to join the current application. To prevent a user who knows the username of another user from taking over the latter's account, the method checks the supplied password. If the password matches, then
UserAppMemberServiceProxy membsvc = new UserAppMemberServiceProxy();
UserAppMember memb = membsvc.LoadEntityByKey(cctx, app.ID, udata.ID);
if (memb != null)
{
status = MembershipCreateStatus.Success;
return GetUserFromModel(udata, memb);
}
else
{
memb = new UserAppMember();
.. assign various properties of memb ...
membsvc.AddOrUpdateEntities(cctx, ums, new UserAppMember[] { memb });
status = MembershipCreateStatus.Success;
return GetUserFromModel(udata, memb);
}
- Case 3: Somebody who supplied a wrong password is trying to register under an existing member's name: reject!
Delete a user
Removes a user from the application's membership records inside the data source. Since users and applications have a many-to-many relationship in the current system, deleting the actual user is not the responsibility of a particular application; it must be done at a higher level, such as inside the data service manager.
public override bool DeleteUser(string username, bool deleteAllRelatedData)
{
CallContext cctx = _cctx.CreateCopy();
UserServiceProxy usvc = new UserServiceProxy();
try
{
List<User> l = usvc.LoadEntityByNature(cctx, username);
if (l == null || l.Count == 0)
return false;
User u = l[0];
UserAppMemberServiceProxy msvc = new UserAppMemberServiceProxy();
UserAppMember memb = msvc.LoadEntityByKey(cctx, app.ID, u.ID);
msvc.DeleteEntities(cctx, new UserAppMemberSet(), new UserAppMember[] { memb });
if (deleteAllRelatedData)
{
UserProfileServiceProxy upsvc = new UserProfileServiceProxy();
UserProfileSet ps = new UserProfileSet();
UserProfileSetConstraints upcond = new UserProfileSetConstraints
{
ApplicationIDWrap = new ForeignKeyData<string> { KeyValue = app.ID },
TypeIDWrap = null, UserIDWrap = new ForeignKeyData<string> { KeyValue = u.ID }
};
var pl = upsvc.ConstraintQuery(cctx, ps, upcond, null);
if (pl.Count() > 0)
{
upsvc.DeleteEntities(cctx, ps, pl.ToArray());
}
UsersInRoleServiceProxy uisvc = new UsersInRoleServiceProxy();
UsersInRoleSetConstraints uircond = new UsersInRoleSetConstraints
{
RoleIDWrap = null,
UserIDWrap = new ForeignKeyData<string> { KeyValue = u.ID }
};
var lir = uisvc.ConstraintQuery(cctx, new UsersInRoleSet(), uircond, null);
if (lir.Count() > 0)
{
uisvc.DeleteEntities(cctx, new UsersInRoleSet(), lir.ToArray());
}
}
return true;
}
catch (Exception e)
{
if (WriteExceptionsToEventLog)
{
WriteToEventLog(e, "DeleteUser");
throw new ProviderException(exceptionMessage);
}
else
{
throw e;
}
}
finally
{
}
}
The DeleteEntities method of each entity service proxy deletes not only the entity itself but also all sets of entities that directly depend on it (namely, the ones that the arrows point to in the above schema diagram), and it does so recursively until the entire entity graph starting from the initial entity is removed. This is sometimes not enough: although the deleted UserAppMember entity has no direct dependency set according to the data schema, a member of an application has indirect dependencies, namely the UserProfile and UsersInRole sets. The deleteAllRelatedData parameter of the DeleteUser method controls whether or not this member-associated data should be removed. To find all associated entities, one should call the ConstraintQuery method of the corresponding service to list all the associated data first, and then delete the set:
UserProfileServiceProxy upsvc = new UserProfileServiceProxy();
UserProfileSet ps = new UserProfileSet();
UserProfileSetConstraints upcond = new UserProfileSetConstraints
{
ApplicationIDWrap = new ForeignKeyData<string> { KeyValue = app.ID },
TypeIDWrap = null, UserIDWrap = new ForeignKeyData<string> { KeyValue = u.ID }
};
var pl = upsvc.ConstraintQuery(cctx, ps, upcond, null);
if (pl.Count() > 0)
{
upsvc.DeleteEntities(cctx, ps, pl.ToArray());
}
and
UsersInRoleServiceProxy uisvc = new UsersInRoleServiceProxy();
UsersInRoleSetConstraints uircond = new UsersInRoleSetConstraints
{
RoleIDWrap = null,
UserIDWrap = new ForeignKeyData<string> { KeyValue = u.ID }
};
var lir = uisvc.ConstraintQuery(cctx, new UsersInRoleSet(), uircond, null);
if (lir.Count() > 0)
{
uisvc.DeleteEntities(cctx, new UsersInRoleSet(), lir.ToArray());
}
Query users
There is a unified "syntax" for querying the relational data sources of the data services. It is independent of the underlying data store or database.
The query methods of the service API receive an instance of QueryExpresion, which, instead of being a string expression, is a data structure composed of a list of token data of type QToken. It must be correctly constructed in order for a call to be successful. A programmer who is already familiar with the schema of the relational data source and the present system may construct it easily; for others, it could take a certain amount of trial and error to get it right.
There is an easier way, however. A user can go to the "Data Source" tab page of the service manager, select the desired data set, and use the smart query guidance system to construct the sorting and filtering conditions interactively there (e.g., here). After a proper expression is found, generate the C# code block for the expression by clicking the corresponding button on the right.
Let's examine the GetNumberOfUsersOnline method in a little more detail:
public override int GetNumberOfUsersOnline()
{
AspNetMemberServiceProxy svc = new AspNetMemberServiceProxy();
TimeSpan onlineSpan = new TimeSpan(0,
System.Web.Security.Membership.UserIsOnlineTimeWindow, 0);
DateTime compareTime = DateTime.UtcNow.Subtract(onlineSpan);
UserAppMemberServiceProxy umsvc = new UserAppMemberServiceProxy();
CallContext cctx = _cctx.CreateCopy();
try
{
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[]
{
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = "desc" }
});
qexpr.FilterTks = new List<QToken>(new QToken[]{
new QToken { TkName = "ApplicationID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"" + app.ID + "\"" },
new QToken { TkName = "&&" }
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = ">" },
new QToken { TkName = svc.FormatRepoDateTime(compareTime) }
});
int users = (int)umsvc.QueryEntityCount(cctx, new UserAppMemberSet(), qexpr);
return users;
}
catch (Exception e)
{
if (WriteExceptionsToEventLog)
{
WriteToEventLog(e, "GetNumberOfUsersOnline");
throw new ProviderException(exceptionMessage);
}
else
{
throw e;
}
}
finally
{
}
}
What does the expression mean? Suppose, for example, that the parameters in the above code block are app.ID = "713a...." and compareTime = July 28, 2013 at 00:05:12 local time (time zone +8:00). Then the string form of the filter expression is ApplicationID == "713a...." && LastActivityDate > 2013-07-28 00:05:12 Ltc, which literally means: "find all users in the user set where" [ApplicationID equals "713a...." and LastActivityDate is greater than 2013/7/27 16:05:12 in Coordinated Universal Time]. The literal string inside [] can also be generated by the system. Here Ltc means the value is given in local time and Utc means it is given in universal (UTC) time.
Note that if we use the above parameters to construct the query expression interactively inside the service manager, the generated expression is more complicated:
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = "desc" }
});
qexpr.FilterTks = new List<QToken>(new QToken[] {
new QToken { TkName = "ApplicationID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"713ab5b4-0a24-499d-bca9-a29c72227d82\"" },
new QToken { TkName = "&&" },
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = ">" },
new QToken { TkName = "2013" },
new QToken { TkName = "-" },
new QToken { TkName = "07" },
new QToken { TkName = "-" },
new QToken { TkName = "28" },
new QToken { TkName = "00" },
new QToken { TkName = ":" },
new QToken { TkName = "05" },
new QToken { TkName = ":" },
new QToken { TkName = "12" },
new QToken { TkName = "Ltc" }
});
which is different from the code above, since the date and time value is not one token but consists of a series of smaller tokens. This is not a problem, since the above is equivalent to the following:
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = "desc" }
});
qexpr.FilterTks = new List<QToken>(new QToken[] {
new QToken { TkName = "ApplicationID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"713ab5b4-0a24-499d-bca9-a29c72227d82\"" },
new QToken { TkName = "&&" },
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = ">" },
new QToken { TkName = "2013 - 07 - 28 00 : 05 : 12 Ltc" }
});
Namely, one can merge smaller tokens into a larger one by concatenating several tokens, separating them with one or more space characters. If you prefer, the filter can also be expressed in a more traditional way, as one string:
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "LastActivityDate" },
new QToken { TkName = "desc" }
});
qexpr.FilterTks = new List<QToken>(new QToken[] {
new QToken
{
TkName =
"ApplicationID == \"713ab5b4-0a24-499d-bca9-a29c72227d82\" &&
LastActivityDate > 2013-07-28 00:05:12 Ltc"
}
});
Here, "Ltc" represents local time and the date (2013-07-28) and time (00:05:12) parts do not really need to have a space to separate them.
You may already notice that the present system uses a special format to express date and time. For date and time in other formats, it is recommended that one should create a .Net DateTime
object from it and the format the DateTime
object by calling the FormatRepoDateTime
method of the service for the overall relational data source.
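For example, to build a filter token from a date given in some other representation, one might first parse it into a DateTime and then let the data source service produce the text it expects. A small sketch based only on calls already used above (the date string here is made up):
AspNetMemberServiceProxy svc = new AspNetMemberServiceProxy();
// parse whatever external representation is at hand into a DateTime ...
DateTime since = DateTime.Parse("2013-07-28T00:05:12+08:00");
// ... and let the service format it into the token text the query parser expects
string sinceTk = svc.FormatRepoDateTime(since.ToUniversalTime());
QueryExpresion qexpr = new QueryExpresion();
qexpr.FilterTks = new List<QToken>(new QToken[] {
    new QToken { TkName = "LastActivityDate" },
    new QToken { TkName = ">" },
    new QToken { TkName = sinceTk }
});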
The Role Provider
The role system in the current solution is hierarchical. An entity inside a hierarchical entity set may depend on another entity in the same set, so a role may have a parent role and/or a set of child roles. This complicates the implementation of the provider a little bit.
This, however, allows the following logic to be implemented natively: a user having a role also has all the roles corresponding to all (if any) of the direct or indirect parent roles of the said role. For example, a user having the Administrators.System role can access not only resources requiring the Administrators.System role but also the ones requiring the Administrators role. However, a user having the Administrators role cannot access more restrictive resources requiring the Administrators.System role.
This makes for a more extensible system. Suppose that, after a site using the present providers has been deployed to a client, the client wants to add a new role, e.g., a HumanResource role under the Administrators role (category). Then nothing more needs to be done to allow users in the new role to access the resources that all users in roles under the Administrators role (category) are allowed to access. In a flat role system, by contrast, either all the parent roles need to be added to each user who is assigned the new role, which is error prone as the system evolves, or the code of the site has to be modified to add the new role (namely, the Administrators.HumanResource role) to the Authorize attribute in the code, which is not something the client can do.
Let's examine a few methods to see how it works.
GetRolesForUser
Gets a list of the roles that a specified user is in for the configured applicationName.
public override string[] GetRolesForUser(string username)
{
CallContext cctx = _cctx.CreateCopy();
try
{
User u = findUser(username);
if (u == null)
return new string[] { };
RoleServiceProxy rsvc = new RoleServiceProxy();
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "RoleName" }
});
qexpr.FilterTks = new List<QToken>(new QToken[]{
new QToken { TkName = "ApplicationID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"" + app.ID + "\"" },
new QToken { TkName = "&&" },
new QToken { TkName = "UsersInRole." },
new QToken { TkName = "UserID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"" + u.ID + "\"" }
});
var roles = rsvc.QueryDatabase(cctx, new RoleSet(), qexpr);
List<string> lrns = new List<string>();
foreach (Role r in roles)
{
if (r.ParentID != null)
{
Stack<Role> srs = new Stack<Role>();
Role pr = r;
while (pr != null)
{
srs.Push(pr);
var p = rsvc.MaterializeUpperRef(cctx, pr);
pr.UpperRef = p;
pr = p;
}
while (srs.Count > 0)
{
string rp = rolePath(srs.Pop());
if (!lrns.Contains(rp))
lrns.Add(rp);
}
}
else
{
string rp = rolePath(r);
if (!lrns.Contains(rp))
lrns.Add(rp);
}
}
return lrns.ToArray();
}
finally
{
}
}
The query expression:
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "RoleName" }
});
qexpr.FilterTks = new List<QToken>(new QToken[]{
new QToken { TkName = "ApplicationID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"" + app.ID + "\"" },
new QToken { TkName = "&&" },
new QToken { TkName = "UsersInRole." },
new QToken { TkName = "UserID" },
new QToken { TkName = "==" },
new QToken { TkName = "\"" + u.ID + "\"" }
});
It literally means: from the Roles entity set, find all roles for the current application (ApplicationID == app.ID) that the current user is in (UsersInRole.UserID == u.ID), ordered by RoleName. After calling QueryDatabase, a list of the roles that are explicitly assigned to the user is obtained. This is not enough to realize our logic, since if a user is in a role, he/she is also in all the parent roles of that explicit role. So, for each of these explicit roles:
if (r.ParentID != null)
{
Stack<Role> srs = new Stack<Role>();
Role pr = r;
while (pr != null)
{
srs.Push(pr);
var p = rsvc.MaterializeUpperRef(cctx, pr);
pr.UpperRef = p;
pr = p;
}
while (srs.Count > 0)
{
string rp = rolePath(srs.Pop());
if (!lrns.Contains(rp))
lrns.Add(rp);
}
}
else
{
string rp = rolePath(r);
if (!lrns.Contains(rp))
lrns.Add(rp);
}
The code section for the case in which the explicit role has a parent role finds all of its parent roles by repeatedly calling the MaterializeUpperRef method of the entity set service, and adds the returned parent roles to the list of roles that the user has.
As shown here, the name of the method used to load an entity that the current entity depends upon follows the pattern:
"Materialize" + <property name>
where <property name> is the name of the property corresponding to the entity that the current entity depends upon; here it is UpperRef.
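The rolePath helper used above is not listed in this article; a plausible implementation, assuming a Role exposes its RoleName and the UpperRef chain materialized in the loop above, might look like this:
private string rolePath(Role r)
{
    // walk the materialized parent chain and build a path like "Grandparent.Parent.Child"
    string path = r.RoleName;
    for (Role p = r.UpperRef; p != null; p = p.UpperRef)
        path = p.RoleName + "." + path;
    return path;
}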
GetUsersInRole
Gets a list of users in the specified role for the configured applicationName.
public override string[] GetUsersInRole(string rolename)
{
CallContext cctx = _cctx.CreateCopy();
try
{
Role r = findRole(rolename);
if (r == null)
return new string[] { };
RoleServiceProxy rsvc = new RoleServiceProxy();
var ra = rsvc.LoadEntityHierarchyRecurs(cctx, r, 0, -1);
List<string> luns = new List<string>();
_getUserInRole(cctx, ra, luns);
return luns.ToArray();
}
finally
{
}
}
private void _getUserInRole(CallContext cctx, EntityAbs<Role> ra,
List<string> usersinrole)
{
UserServiceProxy usvc = new UserServiceProxy();
QueryExpresion qexpr = new QueryExpresion();
qexpr.OrderTks = new List<QToken>(new QToken[] {
new QToken { TkName = "Username" }
});
qexpr.FilterTks = new List<QToken>(new QToken[] {
new QToken { TkName = "UsersInRole." },
new QToken { TkName = "RoleID" },
new QToken { TkName = "==" },
new QToken { TkName = "" + ra.DataBehind.ID + "" }
});
var users = usvc.QueryDatabase(cctx, new UserSet(), qexpr);
foreach (User u in users)
usersinrole.Add(u.Username);
if (ra.ChildEntities != null)
{
foreach (var c in ra.ChildEntities)
_getUserInRole(cctx, c, usersinrole);
}
}
The logic of this method is somewhat the opposite of the one in GetRolesForUser: the users in a particular role are the ones who have the role explicitly, plus all the users who are explicitly in the sub-role-tree of the role. Here is how it is done:
RoleServiceProxy rsvc = new RoleServiceProxy();
var ra = rsvc.LoadEntityHierarchyRecurs(cctx, r, 0, -1);
All hierarchical sets have additional API methods for handling entity-hierarchy-related operations (see the service API documents for the other methods). The method LoadEntityHierarchyRecurs loads part of a tree starting from a node, with a given maximum relative height (measured against the starting node) and maximum relative depth, and returns the root of the loaded sub-tree. It takes four parameters: the second is the starting node, the third is the maximum relative height, and the fourth is the maximum relative depth. If the value -1 is supplied for the relative height or depth, then all available nodes in the corresponding direction are loaded. For example, for the tree shown in the following diagram, if one starts at the red node and calls this method with a maximum relative height of 1 and a maximum relative depth of 1, then the loaded nodes are the ones marked dark gray.
Figure: Here the red node is the starting node and the dark gray nodes are the tree nodes loaded.
So the above code loads the role sub-tree with the role in question as the local root and calls the recursive method _getUserInRole
to accumulate all users that have an explicit role assignment inside the role sub-tree.
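To make the height/depth parameters concrete, here are a few call variants using the same proxy and role variable as above (the comments describe the expected results under the parameter rules just stated):
RoleServiceProxy rsvc = new RoleServiceProxy();
var fullSubTree = rsvc.LoadEntityHierarchyRecurs(cctx, r, 0, -1);  // r plus all of its descendants (as used above)
var oneLevel    = rsvc.LoadEntityHierarchyRecurs(cctx, r, 0, 1);   // r plus its immediate children only
var ancestors   = rsvc.LoadEntityHierarchyRecurs(cctx, r, -1, 0);  // r plus all of its ancestors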
The Profile provider
GetPropertyValues
It is described here because it touches on other aspects of the service API that are worth covering.
public override SettingsPropertyValueCollection GetPropertyValues(
SettingsContext context,
SettingsPropertyCollection ppc)
{
string username = (string)context["UserName"];
bool isAuthenticated = (bool)context["IsAuthenticated"];
SettingsPropertyValueCollection svc = new SettingsPropertyValueCollection();
CallContext cctx = _cctx.CreateCopy();
cctx.OverrideExisting = true;
try
{
User u = isAuthenticated ? findUser(username) : null;
UserProfileTypeServiceProxy uptsvc = new UserProfileTypeServiceProxy();
UserProfileServiceProxy upsvc = new UserProfileServiceProxy();
List<UserProfile> update = new List<UserProfile>();
List<UserProfile> added = new List<UserProfile>();
UserProfileSetConstraints cond = new UserProfileSetConstraints
{
ApplicationIDWrap = new ForeignKeyData<string> { KeyValue = app.ID },
TypeIDWrap = null,
UserIDWrap = new ForeignKeyData<string> { KeyValue = u.ID }
};
var profs = upsvc.ConstraintQuery(cctx, new UserProfileSet(), cond, null);
foreach (SettingsProperty prop in ppc)
{
bool found = false;
if (profs != null)
{
foreach (UserProfile p in profs)
{
if (prop.Name == p.PropName)
{
SettingsPropertyValue pv = new SettingsPropertyValue(prop);
switch (prop.SerializeAs)
{
case SettingsSerializeAs.String:
case SettingsSerializeAs.Xml:
if (!p.IsStringValueLoaded)
{
p.StringValue = upsvc.LoadEntityStringValue(cctx, p.ID);
p.IsStringValueLoaded = true;
}
pv.SerializedValue = p.StringValue;
break;
case SettingsSerializeAs.Binary:
if (!p.IsBinaryValueLoaded)
{
p.BinaryValue = upsvc.LoadEntityBinaryValue(cctx, p.ID);
p.IsBinaryValueLoaded = true;
}
pv.SerializedValue = p.BinaryValue;
break;
default:
break;
}
svc.Add(pv);
update.Add(p);
p.LastAccessTime = DateTime.UtcNow;
p.IsLastAccessTimeModified = true;
found = true;
break;
}
}
}
if (!found)
{
string seras = prop.SerializeAs == SettingsSerializeAs.ProviderSpecific?
SerializationMode.String.ToString() :
prop.SerializeAs.ToString();
var upts = uptsvc.LoadEntityByNature(cctx,
prop.PropertyType.FullName, null, seras);
UserProfileType upt;
if (upts != null && upts.Count > 0)
upt = upts[0];
else
{
upt = new UserProfileType();
upt.ClrType = prop.PropertyType.FullName;
upt.SerializeType = seras;
upt.SerializationProvider = null;
}
UserProfile p = new UserProfile();
p.PropName = prop.Name;
p.ApplicationID = app.ID;
p.UserID = u == null ? null : u.ID;
p.TypeID = upt.ID;
p.UserProfileTypeRef = upt;
p.LastAccessTime = DateTime.UtcNow;
p.LastUpdateTime = p.LastAccessTime;
added.Add(p);
SettingsPropertyValue pv = new SettingsPropertyValue(prop);
switch (prop.SerializeAs)
{
case SettingsSerializeAs.String:
case SettingsSerializeAs.Xml:
pv.SerializedValue = p.StringValue;
break;
case SettingsSerializeAs.Binary:
pv.SerializedValue = p.BinaryValue;
break;
default:
pv.SerializedValue = p.StringValue;
break;
}
svc.Add(pv);
}
}
if (added.Count > 0)
upsvc.AddOrUpdateEntities(cctx, new UserProfileSet(), added.ToArray());
if (update.Count > 0)
upsvc.EnqueueNewOrUpdateEntities(cctx, new UserProfileSet(), update.ToArray());
return svc;
}
catch (Exception e)
{
throw e;
}
finally
{
}
}
This method is called by the application to retrieve the values of a set of properties. The set of properties to be returned is specified by the parameter ppc. The method compares this list against the properties already registered for the user; the ones not found will be added:
if (!found)
{
string seras = prop.SerializeAs == SettingsSerializeAs.ProviderSpecific?
SerializationMode.String.ToString() :
prop.SerializeAs.ToString();
var upts = uptsvc.LoadEntityByNature(cctx,
prop.PropertyType.FullName, null, seras);
UserProfileType upt;
if (upts != null && upts.Count > 0)
upt = upts[0];
else
{
upt = new UserProfileType();
upt.ClrType = prop.PropertyType.FullName;
upt.SerializeType = seras;
upt.SerializationProvider = null;
}
UserProfile p = new UserProfile();
... assign other properties
p.UserProfileTypeRef = upt;
added.Add(p);
... update the value of the property, changed or not ...
}
The interesting part of this code section is the line p.UserProfileTypeRef = upt, where upt is an entity that the UserProfile entity p depends upon (see the data schema diagram). upt is assigned to the UserProfileTypeRef property of p so that it will be added to the UserProfileTypes set automatically when p is added. Such a pattern of adding entities that a given entity depends upon can be applied to any entity, and it can be applied repeatedly, going further and further up the dependency graph until the root entities (namely those entity types that do not depend on other ones) are reached. Together with the way of adding dependent entities discussed above, an entire entity graph can in principle be added to the data source by making one call to the service, as a unit of work.
Other lines of code worth mentioning are:
if (added.Count > 0)
upsvc.AddOrUpdateEntities(cctx, new UserProfileSet(), added.ToArray());
if (update.Count > 0)
upsvc.EnqueueNewOrUpdateEntities(cctx, new UserProfileSet(), update.ToArray());
The added list contains the UserProfile records to be added to the corresponding set; they are added by calling the AddOrUpdateEntities method that we already know. The update list contains the properties that already exist in the corresponding set; they are updated because a UserProfile entity has a LastAccessTime that has to be updated each time the property is read. Since there can be many calls to the GetPropertyValues method for a user, and only the last access really counts, these property updates are handled by the EnqueueNewOrUpdateEntities method. What it does is store the updates inside an internal queue and submit them to the corresponding data set only at some later time. If there is another call to it while it is waiting for the submission, then for each new entity in the update it checks whether there is already an entity with the same intrinsic IDs inside the queue waiting to be submitted. If there is, the changes are merged and the existing entry is replaced by the new one; if not, the new entity is appended to the queue. Multiple, repetitive update calls to the backend data source can thus be avoided by calling this method.
Other methods are not discussed in any more detail here, since much of what is needed to know about working with the service API has already been covered.
A reader may go directly into the source code, especially the test project, to dig for more detailed information, with the help of Visual Studio's IntelliSense and the client API documents.
The Test Project
The test requires some configuration before it can be run. There is an app.config file inside the test project in which the data service endpoints are not fully initialized. The parameter __servicedomain__ should be replaced with the domain name (and port, if not 80) where the data service is hosted.
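For example, assuming the test app.config uses the same endpoint shape as the site configuration shown later in this article, an entry would end up looking something like the following after the placeholder is replaced (the host name and port here are made up):
<endpoint name="HTTP"
          address="http://data.example.com:8080/Services/DataService/AspNetMember/UserSet2.svc"
          binding="basicHttpBinding"
          bindingConfiguration="basicHttpBinding_DataService"
          contract="CryptoGateway.RDB.Data.AspNetMember.IUserService2" />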
Note: do not test against an instance of the data service that multiple agents, including other test agents, might be accessing at the same time. This is required not because the service cannot handle CRUD operations from multiple users, but because the test code constantly resets (without locking) the state of the data source, so the results for both the other agents and the present test agent are unpredictable if out-of-sequence resets by other agents are going on.
The included solution for the test projects can be used not only to test the providers, but also as source code offering an alternative viewpoint from which to study the providers, their functionality, and the data service, because the tests cover far more details than the demo website described next.
The Demo Website
The solution for the demo website is for an ASP.NET MVC 4 site.
It has four tabs. The Home and About tabs can be viewed by all users. The Contact tab can be viewed by users having the Administrators role. The Admin tab can be viewed by users having the Administrators.System role.
public class HomeController : BaseController
{
... other actions ...
[Authorize(Roles="Administrators")]
public ActionResult Contact()
{
ViewBag.Message = "Your contact page.";
return View();
}
... other actions ...
}
and
[Authorize]
public class AccountController : BaseController
{
... other actions ...
[HttpGet]
[Authorize(Roles = "Administrators.System")]
public ActionResult Admin()
{
return View();
}
... other actions ...
}
The data service has two users set up. Log in using one of their credentials to check the effect of role-based authorization. Register new users using the demo site. Add roles and assign them, if you wish, using the data service manager.
If you are using the online demo service, new data added to the remote data service cannot be saved; it will be forgotten as soon as the service is reloaded. For a locally hosted demo service, the folder "App_Data\AspNetMember\Data" under the service site needs proper permissions in order to save data, namely the folder should grant Write permission to the user IIS_IUSRS.
Also note that the "Scripts\DbViewModels\AspNetMember" sub-directory on the service site contains a full list of knockout.js view models that one can use to add administrative content to the demo website or other website using the providers. You can, e.g., build your own more customized role and role assignment interfaces.
Using the Custom Providers
The custom providers must be configured before they can be used on an ASP.NET website.
Configuration
First of all, the machine key values should be re-generated; do not use the ones inside the demo Web.config file.
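For illustration, an explicit machineKey element has roughly the following shape (the key values below are placeholders, not usable keys; generate your own, e.g. with IIS or a key generator). Recall from the Initialize method above that auto-generated keys are rejected unless the password format is Clear:
<system.web>
  ...
  <machineKey validationKey="A1B2C3...your own 128-hex-character value..."
              decryptionKey="D4E5F6...your own 48- or 64-hex-character value..."
              validation="HMACSHA256"
              decryption="AES" />
  ...
</system.web>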
To use the custom providers, the following sections must be added to (or replace the corresponding ones in) the site's Web.config file.
<system.web>
...
other system.web settings
...
<membership defaultProvider="DefaultMembershipProvider">
<providers>
<clear/>
<add name="DefaultMembershipProvider"
type="Archymeta.Web.Security.AspNetMembershipProvider,
AspNetUserServiceProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"
passwordFormat="Hashed"
enablePasswordRetrieval="false"
enablePasswordReset="true"
requiresQuestionAndAnswer="false"
requiresUniqueEmail="true"
maxInvalidPasswordAttempts="5"
minRequiredPasswordLength="6"
minRequiredNonalphanumericCharacters="0"
passwordAttemptWindow="10"
passwordStrengthRegularExpression=""
writeExceptionsToEventLog="false"
applicationName="demo" />
</providers>
</membership>
<roleManager defaultProvider="DefaultRoleProvider" enabled="true">
<providers>
<clear/>
<add name="DefaultRoleProvider" type="Archymeta.Web.Security.AspNetRoleProvider,
AspNetUserServiceProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"
writeExceptionsToEventLog="false"
applicationName="demo" />
</providers>
</roleManager>
<profile defaultProvider="DefaultProfileProvider">
<providers>
<clear/>
<add name="DefaultProfileProvider"
type="Archymeta.Web.Security.AspNetProfileProvider,
AspNetUserServiceProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"
connectionStringName="DefaultConnection"
applicationName="demo" />
</providers>
<properties>
<add name = "FirstName"/>
<add name = "LastName"/>
<group name = "SiteColors" >
<add name = "BackGround"/>
<add name = "SideBar"/>
<add name = "ForeGroundText"/>
<add name = "ForeGroundBorders"/>
</group>
<group name="Forums">
<add name = "HasAvatar" type="bool" />
<add name = "LastLogin" type="DateTime" />
<add name = "TotalPosts" type="int" />
</group>
</properties>
</profile>
</system.web>
The attributes inside each provider node are read by the "Initialize" method of the corresponding provider. They can be used to control the behavior of that provider. The <properties> node of the profile section defines the name, type, serialization, and other meta information of each property that a user can get or set. The list above is just an example; readers should define their own set of properties for user profiles. Note that the <roleManager/> section node has an "enabled" attribute that must be explicitly set to "true" in order for the role provider to be invoked; otherwise the results are unpredictable.
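Once the providers are configured, application code goes through the standard ASP.NET static entry points, which route to the providers declared above. A short, hypothetical usage sketch (the user name, password, and role names are made up, and the roles are assumed to already exist, e.g. created via Roles.CreateRole or the data service manager):
using System.Web.Security;

// membership calls are routed to the configured AspNetMembershipProvider
MembershipCreateStatus status;
MembershipUser user = Membership.CreateUser(
    "alice", "S3cure-Pass!", "alice@example.com",
    null, null, true, null, out status);

// role calls are routed to the configured AspNetRoleProvider
if (status == MembershipCreateStatus.Success)
    Roles.AddUserToRole("alice", "Administrators.System");

// with the hierarchical role logic described earlier, a member of
// Administrators.System should also pass checks against Administrators
bool isAdmin = Roles.IsUserInRole("alice", "Administrators");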
One should also set up the site as a client of the data service. The following is a set of basic settings:
<system.serviceModel>
<bindings>
<basicHttpBinding>
<binding name="basicHttpBinding_DataService"
allowCookies="true" maxBufferSize="6553600"
maxBufferPoolSize="5242880"
maxReceivedMessageSize="6553600" >
<security mode="None" />
</binding>
</basicHttpBinding>
</bindings>
<behaviors>
<endpointBehaviors>
<behavior name="ImpersonateEndpointBehavior">
<clientCredentials>
<windows allowedImpersonationLevel="Delegation"
allowNtlm="true" />
</clientCredentials>
</behavior>
</endpointBehaviors>
</behaviors>
<client>
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/AspNetMemberDatabase2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IAspNetMemberService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/Application_Set2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IApplication_Service2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/RoleSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IRoleService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/UserAppMemberSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IUserAppMemberService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/UserProfileSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IUserProfileService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/UserProfileTypeSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IUserProfileTypeService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/UserSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IUserService2" />
<endpoint name="HTTP"
address="http://_domain_/Services/DataService/AspNetMember/UsersInRoleSet2.svc"
binding="basicHttpBinding"
bindingConfiguration="basicHttpBinding_DataService"
contract="CryptoGateway.RDB.Data.AspNetMember.IUsersInRoleService2" />
</client>
</system.serviceModel>
Here, _domain_ inside the address attribute of each endpoint node represents the domain name or IP address (plus port number, if not 80); one should change it to point to the server on which the data service is hosted.
The <connectionStrings> section of the Web.config file contains an add node with name="DefaultConnection"; it initially points to an ASP.NET-created membership database. You can remove the node if you have no further use for it, or change its content to point to another database that the site is going to use.
Changing the default project
A developer most likely creates an ASP.NET MVC website or web application using one of the templates provided by Visual Studio. For pre-ASP.NET MVC 4.0 web applications, the above changes are enough to switch the solution to the present providers.
For default ASP.NET MVC 4.0 web applications, there is more to change. This is because the default membership provider in the generated solution is the SimpleMembership one (see here), which is not derived from the MembershipProvider base class but from a new ExtendedMembershipProvider class that includes support for the OAuth access API. The generated AccountController depends on static methods of the WebSecurity class inside the WebMatrix.WebData assembly to call the provider APIs, and these are not compatible with providers derived from the MembershipProvider base class.
For ASP.NET MVC 4.0, the generated AccountController requires many changes to be usable here. One simple solution, however, is to create an ASP.NET MVC 3.0 web application and copy the content of its AccountController class over the MVC 4.0 one. Then one can remove the InitializeSimpleMembershipAttribute.cs file under the Filters folder from the project, since the current providers do not depend on Entity Framework to access the data service. Also to be removed are the project references to WebMatrix.Data.dll and WebMatrix.WebData.dll; they are no longer needed.
Setup the Data Service
Extract the files from the Member Data Service package to a folder and configure a website for it (it is an ASP.NET MVC 4 web application).
Enable HTTP activation of WCF on your system. That's basically it.
If you need to persist your changes, at least the "App_Data\AspNetMember\Data" sub-directory under the service site needs proper permissions: it should grant the user IIS_IUSRS Write permission.
The data service comes with pre-loaded sample data sets for demonstration purposes. The service itself has a web front end with which a user can add, delete, or query data. If one needs to manipulate the data using the web front end, please read the corresponding section here for detailed instructions; those instructions also apply to the current data service.
Warning: This is a demonstration version of the system for evaluation purposes. It is based on a transient serialization method. Please do not use it for long-term data persistence or for supporting production systems.
Points of Interest
Service under Linux
Some effort has been made to get the data service to run on a Linux box under Mono (version 3.0.3.1) (xsp + mono). The following package was built using the latest MonoDevelop (which is itself compiled from the git source). The data service site can now be browsed, which means that the ASP.NET MVC 4 part seems to be OK for the service. Thanks to the great work done by the people involved in that project.
There still seems to be a lot of hard work for them to do in order to make the WCF part work like the one in .NET WCF. It is hoped that the findings here can provide information to help make Mono a better platform for hosting services.
Our experiments show that the webHttpBinding (RESTful + JSON) part of Mono WCF does not allow calls to service methods having more than one parameter, so most of the management pages of the data source cannot be used as expected when hosting the service on a Linux box.
The basicHttpBinding part of Mono WCF works better, but it still seems not quite there. When running our test suite against a service hosted on a Linux server, about half of the tests fail. Detailed investigation shows that many of the problems are related to data serialization, namely some of the object graphs returned are not serialized as expected, at least when a Windows .NET client program is used. For example, by using debuggers on both the service side and the client side to compare, we found that 1) for some entity graphs, child objects that are present on the service side before sending are missing on the client side; 2) member properties not marked with the DataMember attribute are serialized, resulting in a much larger object graph being returned; etc.
Northwind database migrated
The Microsoft Northwind database was migrated to our demo in-memory database for demonstration purposes. Here is an online demo.
A new kind of search
StackExchange.com provides a periodic dump of its data in XML format. A relational data schema was inferred from the data sets, and a read-only data service was built for them. The service is currently attached to a PostgreSQL database engine, featuring both native and unified full-text search expressions that can be combined with other metadata filtering in an arbitrary query expression. Interested readers can visit demo site A (which contains about 440 thousand Q&As) for serverfault.com and demo site B (which contains about 50 thousand Q&As) for gis.stackexchange.com. The service UI allows a user to find, sort, and study the data in much more accurate and flexible ways.
A demo portal for service-based relational data + keyword + dynamic categorization search is online; see here.
History
- Version 1.0: Initial publication.
- Version 1.1: Overall system updates to support the sub-system for data migration amongst heterogeneous databases (having the same data schema).
- Version 1.1.1: Significantly improved the intelligence and performance of the data import and data sync/migration sub-systems. Minor API changes were made.
- Version 1.2: Both native and unified full-text indexing and search are supported for those tables in a database that require them. An overall upgrade of the demo site (libraries, styles, etc.) was done; it is based on Foundation now.
- Version 1.2.1: Bug fixes to the providers. Added data annotations to the data models. Improved documents.
- Version 1.2.2: Incorrect behavior when adding or updating items, due to a bug in the service JavaScript, is corrected. More views are added. Only the package for the service is changed.
- Version 1.2.5: Overall system refinements, updates, bug fixes, performance tuning, new additions to the API, and updated documents.
- Version 1.2.6: Overall system updates and enhancements.
- Version 1.3.0: Accumulated system updates and enhancements. Asynchronous service proxy API added for versions of the .NET Framework after 4.0, with documents for the async API. Asynchronous membership stores for post-.NET 4.0 versions of ASP.NET published.
- Version 1.5.0: The data service now runs under .NET 4.5.1 and ASP.NET MVC 5, with many features improved and new features added, such as support for SignalR- or WCF-based entity change event subscription ports.