Tuesday, September 30, 2008

Reading SharePoint Lists into an ADO.Net DataTable

[Feb 18, 2009: I've posted an update to show the newer technique suggested below by Kirk Evans, also compensating for some column naming issues.]

The other day, I needed to write some code that processed data from a SharePoint list. The list was hosted on a remote MOSS 2007 server.

Given more time, I'd have gone digging for an ADO.NET adapter, but I found some code that helped. Unfortunately, it didn't quite work for my needs: out of the box, it missed several columns for no apparent reason.

Here's my tweak to the solution:

(ListWebService is a web reference pointing to the SharePoint Lists web service, e.g. http://SiteHost/SiteParent/Site/_vti_bin/lists.asmx?WSDL )

// Note: "data" is assumed here to be a namespace alias (using data = System.Data;),
// and ListWebService is the web reference described above.
private data.DataTable GetDataTableFromWSS(string listName)
{
    ListWebService.Lists lists = new ListWebService.Lists();
    lists.UseDefaultCredentials = true;
    lists.Proxy = null;

    // Find the list by its display title and resolve its internal name.
    XmlNode ListCollectionNode = lists.GetListCollection();
    XmlElement List = (XmlElement)ListCollectionNode.SelectSingleNode(
        String.Format("wss:List[@Title='{0}']", listName), NameSpaceMgr);
    if (List == null)
    {
        throw new ArgumentException(String.Format(
            "The list '{0}' could not be found in the site '{1}'",
            listName, lists.Url));
    }
    string TechListName = List.GetAttribute("Name");

    data.DataTable result = new data.DataTable("list");
    XmlNode ListInfoNode = lists.GetList(TechListName);
    System.Text.StringBuilder fieldRefs = new System.Text.StringBuilder();
    System.Collections.Hashtable DisplayNames = new System.Collections.Hashtable();

    // Create a DataTable column per list field, disambiguating duplicate
    // display names, and build the ViewFields CAML as we go.
    foreach (XmlElement Field in ListInfoNode.SelectNodes(
        "wss:Fields/wss:Field", NameSpaceMgr))
    {
        string FieldName = Field.GetAttribute("Name");
        string FieldDisplayName = Field.GetAttribute("DisplayName");
        if (result.Columns.Contains(FieldDisplayName))
        {
            FieldDisplayName = FieldDisplayName + " (" + FieldName + ")";
        }
        result.Columns.Add(FieldDisplayName, TypeFromField(Field));
        fieldRefs.AppendFormat("<FieldRef Name=\"{0}\" />", FieldName);
        DisplayNames.Add(FieldDisplayName, FieldName);
    }
    result.Columns.Add("XmlElement", typeof(XmlElement));

    XmlElement fields = ListInfoNode.OwnerDocument.CreateElement("ViewFields");
    fields.InnerXml = fieldRefs.ToString();
    XmlNode ItemsNode = lists.GetListItems(
        TechListName, null, null, fields, "10000", null, null);

    // Lookup fields always start with the numeric ID, then ;# and then the
    // string representation. We are normally only interested in the name,
    // so we strip the ID.
    System.Text.RegularExpressions.Regex CheckLookup =
        new System.Text.RegularExpressions.Regex("^\\d+;#");

    foreach (XmlElement Item in ItemsNode.SelectNodes("rs:data/z:row", NameSpaceMgr))
    {
        data.DataRow newRow = result.NewRow();
        foreach (data.DataColumn col in result.Columns)
        {
            // The XmlElement column has no ows_ mapping, so HasAttribute
            // simply returns false for it.
            if (Item.HasAttribute("ows_" + (string)DisplayNames[col.ColumnName]))
            {
                string val = Item.GetAttribute("ows_" + (string)DisplayNames[col.ColumnName]);
                if (CheckLookup.IsMatch(val))
                {
                    val = val.Substring(val.IndexOf("#") + 1);
                }
                // Assigning a string to a column typed as a number or
                // DateTime converts the value implicitly.
                newRow[col] = val;
            }
        }
        newRow["XmlElement"] = Item;
        result.Rows.Add(newRow);
    }
    return result;
}

// Lazily builds the namespace manager used by the XPath queries above.
private static XmlNamespaceManager _nsmgr;
private static XmlNamespaceManager NameSpaceMgr
{
    get
    {
        if (_nsmgr == null)
        {
            _nsmgr = new XmlNamespaceManager(new NameTable());
            _nsmgr.AddNamespace("wss", "http://schemas.microsoft.com/sharepoint/soap/");
            _nsmgr.AddNamespace("s", "uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882");
            _nsmgr.AddNamespace("dt", "uuid:C2F41010-65B3-11d1-A29F-00AA00C14882");
            _nsmgr.AddNamespace("rs", "urn:schemas-microsoft-com:rowset");
            _nsmgr.AddNamespace("z", "#RowsetSchema");
        }
        return _nsmgr;
    }
}

// Maps a SharePoint field type to the closest .NET type for the DataTable column.
private Type TypeFromField(XmlElement field)
{
    switch (field.GetAttribute("Type"))
    {
        case "DateTime":
            return typeof(DateTime);
        case "Integer":
            return typeof(int);
        case "Number":
            return typeof(float);
        default:
            return typeof(string);
    }
}
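
For what it's worth, here's a minimal sketch of a call site. The list title "Announcements", the grid control, and the "Title" column are placeholders for illustration, not part of the code above:

// Hypothetical usage: read a list by its display title.
data.DataTable announcements = GetDataTableFromWSS("Announcements");

// Bind it to a grid...
dataGridView1.DataSource = announcements;

// ...or walk the rows directly ("Title" is an assumption about the
// target list's schema).
foreach (data.DataRow row in announcements.Rows)
{
    Console.WriteLine(row["Title"]);
}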

Thursday, September 25, 2008

The Great Commandment

While I was writing a post the other day, I noticed that I had neglected a topic I find very important in software development: risk management.

There are only a few guarantees in life. One of them is risk. Companies profit by seizing the opportunities that risks afford. Of course, they suffer losses when unmitigated risks become incidents. All our government and social systems are devices of risk management. In business, risk management is (now, and ever shall be) the great commandment.

Many software engineers forget that risk management is not just for PMs. In fact, software and its development is fundamentally a tool of business and, by extension, of risk management. The practice of risk management in software really extends into every expression in every line of source code.

Don’t believe me? Think of it this way… if it weren’t a risk, it would be implemented in hardware. I've often heard hardware engineers say that anything that can be done in software can be done in hardware, and it will run faster. Usually, if a solution is some of the following…
· mature,
· ubiquitous,
· standard,
· well-known,
· fundamentally integral to its working environment

…it is probably low risk, particularly for change. It can likely be cost-effectively cast in stone (or silicon). (And there are plenty of examples of that… it’s what ASICs are all about.)

Software, on the other hand, is not usually so much of any of those things. Typically, it involves solutions which are…
· proprietary,
· highly customized,
· integration points,
· inconsistently deployed,
· relatively complex / error-prone
· immature or still evolving

These are all risk indicators for change. I don’t care what IT guys say… software is much easier to change than logic gates on silicon.

I’ve dug in to this in the past, and will dig in more on this in future posts, but when I refer to the “great commandment”, this is what I mean.

Application Platform Infrastructure Optimization

In doing some research for a client on workflow in SharePoint, I came across this interesting article about the differences between BizTalk 2006 and the .NET Workflow Foundation (WF).

The article itself was worth the read for its main point, but I was also interested in Microsoft's Application Platform Infrastructure Optimization ("APIO") model.

The "dynamic" level of the APIO model describes the kind of system that I believe the .NET platform has been aiming at since 3.0.

I've been eyeing the tools for a while now... between MS's initiatives, my co-workers' project abstracts, and the types of work coming down the pike in consulting. From the timing and feature sets of MS's releases, I should have known: the webinars they've released on the topic have been around for just over a year.

This also plays into Microsoft Oslo. I have suspected that Windows Workflow Foundation, or some derivative thereof, is at the heart of the modeling paradigm that Oslo is based on.

All this stuff feeds into a hypothesis I've mentioned before that I call "metaware", a metadata layer on top of software. I think it's a different shade of good old CASE... because, as we all know... "CASE is dead... Long live CASE!"

Monday, September 15, 2008

facebook

I've been avoiding the whole MySpace / Facebook thing for a while now... but now I'm checking it out. A little part of me is afraid that a public ill-prepared for the communications onslaught of Web 2.0 toys like Facebook will fall prey to it. It may lead to implants that allow people to have every thought cataloged for later analysis. Before you know it, we'll all be assimilated! (Resistance is futile!)

Sunday, September 14, 2008

Champions of Disruption

I've been noticing lately that truly interesting things only happen on the "edge". Everything is energy, and everything happens at the point where energy flows are disrupted.

If you don't believe me, just ask Mother Nature. Take solar energy. Powerful energy flows from our sun and saturates our solar system... but all the amazing things happen where that energy flow is disrupted. The Earth disrupts it, and the result, in this case, is merely life as we know it.

It's so primal that we've abstracted the concept of energy flows, and call it (among other things) currency. When we sell a resource (a form of energy, in a sense), we even call that change "liquidation".

Sure, potential energy has value, but there are no edges in a region of potential energy. Potential energy is usually static, consistent, and only really exciting for what it could do or become, rather than what it currently is.

Likewise, it's where disruptions occur that there's business to be done.

According to this article on InformationWeek, CIOs and CTOs appear to have generally become change-averse order takers. The surveys cited indicate that many shops are not actively engaged in strategy or business process innovation.

Perhaps they're still feeling whipped by the whole "IT / Business Alignment" malignment. Maybe they're afraid that business process innovation through technology innovation would come off as an attempt to drive the business. Ultimately, it seems many are going into survival mode, setting opportunities for change aside in favor of simply maintaining the business.

Maybe the real challenge for IT is to help business figure out that innovation is change, and change is where the action is.

In any case, it seems there's a lot of potential energy building up out there.

The disruptions must come. Will you be a witness, a victim, or a champion of them?

Saturday, September 13, 2008

Retail IT in the Enterprise

Lately, the projects I've been on have had me taking on roles outside my comfort zone. (I'm not talking about downtown Boston... with the "Boston Express" out of Nashua, I'm OK with that.)

I've always been most comfortable, myself, in cross-discipline engineering roles, especially in smaller teams where everyone's got good cross-discipline experience. The communications overhead is low. The integration friction is low. Everyone knows how it needs to be done, and people are busy building rather than negotiating aggressively.

These types of tight, focused teams have always had business-focused folks who took on the role of principal consultant. In this type of situation, the principal consultant provides an insulation boundary between the technical team and the customer.

This insulation has made me comfortable in that "zone": I'm a technologist. I eat, sleep, and dream software development. I take very seriously the ability to communicate complex technical concepts to my peers effectively and concisely.

So like I said, lately the projects I've been on have yanked me pretty hard out of that zone. I've been called on to communicate directly with my customers. I've been handling item-level projects, and it's a different world. There is no insulation. I'm filling all my technical roles, plus doing light BA and even PM duty.

Somewhat recently, I emailed a solution description to a CFO. The response: "Send this again in user-level English."

It killed me.

I've gotten so used to having others "protect" me from this sort of non-technical blunder. In contemporary projects, the insulating consulting roles are simply not present.

It makes me wonder about the most important lessons I learned during my school days... In high school, maybe it was retail courtesy and retail salesmanship in a technical atmosphere ("Radio Shack 101"). In college, the key lessons might have been how to courteously negotiate customer experience levels ("Help Desk 101").

Wednesday, September 10, 2008

Compact and Full .NET Frameworks

One of the things I've been intrigued by for a while now is the fact that code compiled for the .NET Compact Framework (all versions) executes very nicely on the full .NET Framework.

For example, my personal hobby project, "Jimmy Sudoku", is written in C# for the .NET Compact Framework 2.0. There are actually two install kits. The first is a .CAB file for Windows Mobile devices. The second is an .MSI for Windows 9x, XP, and Vista. The desktop install kit even serves two purposes. First, it installs the program on the desktop. Second, it leverages ActiveSync to push the .CAB up to the Windows Mobile device.

It's a .NET Compact Framework app especially for Windows Mobile devices, but many 'Jimmy' fans don't have a Windows Mobile device to run it on.

The coolest part is the ease with which all of the components interoperate. The .EXE and .DLLs that are delivered to the mobile device are the very same ones that are delivered to the desktop. Like Silverlight to WPF, the Compact Framework is a compatible subset of the full framework, so interoperability is a given.

Even better, you can reference CF assemblies from full-framework assemblies. One immediate offshoot of this in my hobby project: the web service I built to serve "Game of the Day" requests actually references the CF assembly that implements the game-state model and game-generator code. The assembly that generates games on Windows Mobile PDAs and cell phones is the very same assembly that generates games in the ASP.NET web service.
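
To illustrate the idea, here's a minimal sketch of such a web service, assuming a hypothetical GameGenerator class compiled into the CF assembly (JimmySudoku.Core and its members are placeholder names, not the actual Jimmy Sudoku API):

using System.Web.Services;

// Full-framework ASP.NET web service that references the very same
// Compact Framework assembly shipped to the device.
public class GameService : WebService
{
    [WebMethod]
    public string GetGameOfTheDay()
    {
        // GameGenerator lives in the CF assembly; the full framework
        // loads and runs it like any other managed library.
        JimmySudoku.Core.GameGenerator generator = new JimmySudoku.Core.GameGenerator();
        return generator.GenerateGame();
    }
}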

Admittedly, there are some bothersome differences between the CF and the full .NET Framework. The CF does not support WPF. The CF has no facilities for printing. Also, while the CF does support some of the common Windows Forms dialogs, it does not support the File Save and File Open dialogs on Windows Mobile Standard Edition (smartphone / non-touchscreen) devices.

These differences can be overlooked to some extent, though, given that one compiled assembly can execute on so many very different machine types. Further, with this interoperability, one can extend a CF-based core with full-framework features. For example, I'm currently playing with desktop print functionality for my hobby project, roughly as sketched below.
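
As a sketch of what that printing extension might look like (PrintDocument is full-framework only; the Game class and its ToStringGrid method are hypothetical stand-ins for the CF core model):

using System.Drawing;
using System.Drawing.Printing;

// Desktop-only printing layered over the shared CF core assembly.
public class GamePrinter
{
    private readonly JimmySudoku.Core.Game _game;

    public GamePrinter(JimmySudoku.Core.Game game)
    {
        _game = game;
    }

    public void Print()
    {
        PrintDocument doc = new PrintDocument();
        doc.PrintPage += delegate(object sender, PrintPageEventArgs e)
        {
            // Render the puzzle text produced by the CF assembly.
            e.Graphics.DrawString(_game.ToStringGrid(),
                new Font("Courier New", 12), Brushes.Black, e.MarginBounds);
        };
        doc.Print();
    }
}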

Something that I'd really love to see, some day, is a good excuse to develop a Windows Forms app for a client that had shared components between the desktop and a mobile.

I can imagine that this model would be superb for a huge variety of applications, allowing a fully featured UI for the desktop version, and an excellent, 100% compatible, very low risk (almost "free") portable version.


I've often thought this would work great for apps that interface with hardware, like:
· field equipment,
· mobile equipment,
· vehicles of all sorts,

...simply plug in your PDA (via USB or Bluetooth), and it becomes a smart management device for the equipment, using the very same code that also runs on the desktop.

Thursday, September 4, 2008

Semi-IT / Semi-Agile

While working on-site for a client, I noticed something interesting. On the walls of some of my client's users' offices, along with other more classic credentials, are certifications from Microsoft... SQL Server 2005 query language certifications.

I've heard a lot about the lines between IT and business blurring. We talk a fair amount about it back at HQ.

Interestingly, this is a clear case of a mid-tier layer between classic IT (app development, data management, advanced reporting) and the business, in the form of ad hoc SQL querying and cube analysis. In many ways, it's simply a "power user" layer.

The most interesting part about it is the certification, itself. The credentials that used to qualify an IT role are now being used to qualify non-IT roles.

Another trend I'm seeing is that development ceremony expectations vary with the risk of the project. Higher-risk projects are expected to proceed with more waterfall-style ceremony. Lower-risk projects proceed with more neo-"agility".

The project I was on was apparently considered "medium" risk. The way I saw this play out was that all the documentation of a classic waterfall methodology was required, but the implementation was expected to develop along with the documentation.

In many ways, it was prototyping into production. Interestingly, the project demanded this approach: the business users simply did not have time to approach it in full waterfall fashion. Had we been forced into a full-fledged classic waterfall methodology, we might still be waiting to begin implementation rather than finishing UAT.

Power and Control: Fusion Report 13 June 008


Thanks for the report, M. Simon. My post about Energy Productization was actually a bit of a response to an article I read that covered both the WB7 and ITER. Your report hits the points much more concisely.

Tuesday, September 2, 2008

Energy Productization

I think the ITER project, a grand-scale fusion project, is interesting. I'm troubled by it for a few reasons, though. I can't help but think that there are only a few reasons we need it, and most of them have to do with power... of a controlling nature.

There's already a great big huge fusion reactor throwing more energy at us than we can collect, let alone use... every single day... the sun.

Efforts to create fusion reactors here on earth are great for these reasons:
1) Energy for space exploration
2) Productization of energy
3) Weapons innovation

Once we learn how fusion's done, we can build spacecraft around it that could potentially get us somewhere in the galaxy. That's all well and good, but will it happen before we poison our existing biosphere?

Once we have fusion reactors, energy moguls can sell it. Oh, great! Instead of energy productization through oil, we get energy productization through fusion... because we can't all have one of these great big huge reactors in our basements. At least it's renewable. If the moguls are benevolent, it might even be cheap.

Finally, once we have fusion reactors like this, we'll learn new ways to blow ourselves out of our collective misery... so I suppose the first two points are moot once this comes along.

Monday, September 1, 2008

Economic Detox

While contemporary headlines bode poorly for the U.S. economy, I see them as signs of hope...

I keep hearing high-pitched alarms about the weakening U.S. dollar, inflation, energy prices, and the housing market bubble burst. We all see the ugly face of these conditions.

Global trade has been a bitter (but necessary) pill for the U.S. Perhaps the Clinton-detonated U.S. economic nuclear winter (of global trade, NAFTA, etc.) is finally starting to give way to a new economic springtime in the States.

In the late-90's US market, there were a lot of excesses in the technology sector. Then the bubble burst. When the dust settled, we (the US IT industry) found ourselves disenfranchised by our sponsors... corporate America beat us with our own job hopping. U.S. engineers hopped off to the coolest new startups and rode their high salaries into the dirt, while enduring companies went lean, mean, and foreign. We had become so expensive that we were sucking our own project ROIs completely out of sight. By hooking into foreign talent pools, the ROIs became visible again.

Nearly a decade later, look what's happening around the world... Many foreign IT job markets are falling into the same salary-inflation trap the U.S. market fell into. They are going through the inflation we experienced; their prices are rising.

Combine their salary inflation with our salary stagnation and a weakening dollar, and what do you get?

A leaner, meaner domestic competitor.

In a sense, it's like that in many sectors of the U.S. economy.

So let the U.S. dollar weaken... It means that America can go back to being product producers (rather than mindless consumers) in the global market!