Wednesday, June 29, 2011

Microsoft Community Contributor Award

This morning I received my Microsoft Community Contributor Award for my contributions to Microsoft's online technical communities.



Thanks to the Microsoft Community and Online Support team for recognizing community contributors.

Tuesday, June 7, 2011

Querying calendar list by meeting workspace Url

The Calendar list doesn't support querying items by their workspace URL, which makes finding the calendar item behind a recurring meeting workspace a daunting task. Neither the U2U CAML editor nor SharePoint views can crack this nut.

The trick is to pass the server-relative URL in the CAML <Value> segment of the query. The following code snippet shows how.

//removing the server hostname and protocol, e.g. stripping http://Foo from http://Foo/meetingsite
//meetingWeb is the meeting workspace SPWeb and list is the calendar SPList (both assumed in scope)
string searchUrl = meetingWeb.Url;
string hostServer = meetingWeb.Site.Protocol + "//" + meetingWeb.Site.HostName + ":" + meetingWeb.Site.Port;
searchUrl = searchUrl.Replace(hostServer, string.Empty);
//on default ports the URL carries no ":80"/":443", so strip the host without the port too
searchUrl = searchUrl.Replace(meetingWeb.Site.Protocol + "//" + meetingWeb.Site.HostName, string.Empty);
SPQuery qry = new SPQuery();
qry.Query = @"<Where><Eq><FieldRef Name='Workspace' />
 <Value Type='URL'>" + SPEncode.UrlEncodeAsUrl(searchUrl) + "</Value></Eq></Where>";
qry.ViewFields = @"<FieldRef Name='Title' />
              <FieldRef Name='Location' />
              <FieldRef Name='EventDate' />
              <FieldRef Name='EndDate' />
              <FieldRef Name='fAllDayEvent' />
              <FieldRef Name='fRecurrence' />";
//SPList.GetItems returns an SPListItemCollection; convert it to a DataTable
DataTable tbl = list.GetItems(qry).GetDataTable();
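
For completeness, a minimal sketch of wiring the snippet up; the site URL, web path, and list title here are placeholders, not from the original scenario:

using (SPSite site = new SPSite("http://Foo"))
using (SPWeb meetingWeb = site.OpenWeb("/meetingsite"))   //the recurring meeting workspace
{
    SPList list = site.RootWeb.Lists["Calendar"];         //the calendar that owns the meeting items
    //...run the query above with meetingWeb and list in scope...
}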

Sunday, June 5, 2011

CrossListQueryInfo and CrossListQueryCache gotchas

The CrossListQueryInfo class, which ships with the Microsoft publishing infrastructure, uses object caching to serve the requested data from the cache instead of from the database. Remember to use this facility only if your code runs against a MOSS server, not a WSS server. Because it uses the object cache stored on the WFE servers, it avoids database I/O and the network latency between the WFEs and the database server.

Although it uses SPSiteDataQuery internally, the very first request through CrossListQueryCache takes more time than a plain SPSiteDataQuery, since it includes the time to query, cache, and return the values. Subsequent queries, however, come back in a flash and are nowhere near comparable with SPSiteDataQuery.
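
For comparison, a minimal sketch of the uncached path using SPSiteDataQuery; the list template, scope and field names are illustrative assumptions:

//SPSiteDataQuery hits the content database on every call; no object cache is involved
SPSiteDataQuery q = new SPSiteDataQuery();
q.Lists = "<Lists ServerTemplate='104' />";     //104 = announcement lists (illustrative)
q.Webs = "<Webs Scope='Recursive' />";          //query the web and all subwebs
q.ViewFields = "<FieldRef Name='Title' />";
q.Query = "<Where><Gt><FieldRef Name='ID' /><Value Type='Counter'>0</Value></Gt></Where>";
q.RowLimit = 100;
DataTable uncached = web.GetSiteData(q);        //web is an SPWeb in scope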

An interesting fact is that CrossListQueryCache and CrossListQueryInfo will not work in a console tool; they depend heavily on the SharePoint context. And if you are trying this query technique in a web part, remember that whenever you change the web part and want to debug it, an IISReset/app pool recycle is not enough: don't forget to flush the object cache as well.

A sample CrossListQueryCache query (the CAML strings shown are representative examples):
public DataTable GetDataTableUsingCrosslistQuery(SPSite site)
{
    CrossListQueryInfo clquery = new CrossListQueryInfo();
    clquery.RowLimit = 100;
    clquery.WebUrl = site.ServerRelativeUrl;  //WebUrl is mandatory when UseCache is true
    clquery.UseCache = true;
    clquery.Lists = "<Lists ServerTemplate='104' />";
    clquery.Webs = "<Webs Scope='Recursive' />";
    //illustrative: fetch the item with ID 569
    clquery.Query = "<Where><Eq><FieldRef Name='ID' /><Value Type='Counter'>569</Value></Eq></Where>";
    clquery.ViewFields = "<FieldRef Name='Title' />";
    CrossListQueryCache cache = new CrossListQueryCache(clquery);
    DataTable results = cache.GetSiteData(site);
    return results;
}
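
A minimal usage sketch from inside a web part's CreateChildControls; the wiring is hypothetical:

//the cache needs a SharePoint context, so don't call this from a console tool
//SPContext.Current.Site must not be disposed by our code
DataTable dt = GetDataTableUsingCrosslistQuery(SPContext.Current.Site);
Label lbl = new Label();
lbl.Text = (dt != null) ? dt.Rows.Count + " rows" : "no results";
Controls.Add(lbl);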

Tuesday, May 10, 2011

Multiple NT authentication prompts in MOSS 2007 Server

In our team, people usually build their own development machines from scratch, and we recently faced this multiple-authentication-prompt issue while accessing Central Administration right after running the PSConfig wizard.

We tried all the browser tricks:
  • Switching off the IE Enhanced Security Configuration
  • Adding the site to IE's Local intranet zone
  • Checking IE's cookie storage and authentication settings
Nothing worked; there were still multiple authentication prompts, and Fiddler showed a stream of HTTP 401 responses. Even without providing a username and password, by pressing the ESC key repeatedly we could reach the Central Administration home page, but none of the OK/Cancel buttons worked as expected, leaving the Central Administration console useless.

Finally Manimaran found Microsoft KB article 95271, which describes exactly this scenario: installing IIS 6.0 on top of a Windows Server 2003 SP2 machine.

The machine was patched with Windows Server 2003 SP2, and we had installed IIS 6.0 from an RTM disc, which apparently reverted some libraries to their older versions, especially asp.dll. Replacing only this library didn't work out, and I didn't want to re-install OS Service Pack 2 again.

Workaround without re-installing Windows Server 2003 SP2:
  • Keep the Windows Server 2003 SP1 and SP2 installation media handy in a drive.
  • Extract the I386 content of these service packs using an archive tool; mine is 7-Zip.
  • Uninstall SharePoint Server and delete its content databases from the database server.
  • Remove the machine's Application Server role and uninstall IIS 6.0.
  • Add the Application Server role and install IIS 6.0.
  • Whenever the installer prompts for files, point it to the folder where you extracted SP2.
  • If you don't find the files in the SP2 folder, drill down into the SP1 folder, and failing that, the RTM disc.
That's it; now start installing SharePoint.

Wednesday, April 20, 2011

Continuous Integration and remote deployment for SharePoint

This is a short guide to setting up a continuous integration server, integrating various plugins to analyse code quality, and, at the end of the pipeline, deploying to a remote staging server. The core idea is to build the solution on top of an open source stack rather than proprietary solutions like TFS.

Setting up Continuous Integration server:

  • Although there are tons of CI tools available in the market, CruiseControl.NET looked great at integrating various tools, so we picked it for our implementation.
  • Install CruiseControl.NET on a designated build server.
  • CC.NET supports various source control blocks: CVS, Subversion, VSS, SourceGear, StarTeam, etc.
  • Download a command line client for your source control repository; for Subversion, download the CollabNet Subversion command line client.
  • Upon successful installation, browse to the folder C:\Program Files\CruiseControl.NET and open the ccnet.config file in a text editor.
  • Add the project-level detail:

<project name=" testProject">
<workingDirectory>C:\develop\project1WorkingDir </workingDirectory>
</project>
  • Add a source control block, which contains the SVN repository endpoint and the credentials required to access it.
<sourcecontrol type="svn">
<trunkUrl>svn://Foo/trunk</trunkUrl>
<Working Directory>C:\develop\project1WorkingDir </workingDirectory>
<username>ccnet </username>
<password> ccnet </password>
</sourcecontrol>
  • Now we've pointed CC.NET at the Subversion repository with the credentials required to access it.
  • Let's add a trigger block to tell the integration server how often to get the latest sources and run a build; we set the frequency to 360 seconds.
<triggers>
  <intervalTrigger name="Subversion" seconds="360" buildCondition="ForceBuild" />
</triggers>
  • Add a task block for building the solution. CC.NET supports various build automation engines such as MSBuild, Visual Studio (devenv), NAnt, etc. We'll configure MSBuild.
<tasks>
  <msbuild>
    <executable>C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\MSBuild.exe</executable>
    <workingDirectory>C:\develop\project1WorkingDir</workingDirectory>
    <projectFile>Sampleprojects.sln</projectFile>
  </msbuild>
</tasks>
  • Alternatively, if you want to use Visual Studio to build the solution, the following fragment will help:
<devenv>
  <solutionfile>C:\development\Foo.sln</solutionfile>
  <configuration>Debug</configuration>
  <buildtype>Clean</buildtype>
  <executable>C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe</executable>
  <buildTimeoutSeconds>360</buildTimeoutSeconds>
</devenv>
  • Add an exec task after the build task to integrate third-party analysis tools such as Microsoft FxCop for static code analysis, the SPDisposeCheck tool for memory leak checks, Microsoft StyleCop, etc.
<exec executable="C:\Program Files\Microsoft\SharePoint Dispose Check\SPDisposeCheck.exe">
  <buildArgs>C:\Foo\bin</buildArgs>
</exec>
  • The exec snippet above requires all binaries to be collected into a single folder for SPDisposeCheck to scan; this can be done by adding a post-build command in the .csproj files, as sketched below.
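A hedged sketch of such a post-build event in a .csproj file; the target folder C:\Foo\bin mirrors the exec snippet above and is illustrative:
<PropertyGroup>
  <!-- copy each project's output assembly into the folder SPDisposeCheck scans -->
  <PostBuildEvent>xcopy "$(TargetPath)" "C:\Foo\bin\" /Y</PostBuildEvent>
</PropertyGroup>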
  • Integrate unit test execution and code coverage tools such as MSTest or NUnit; an MSTest example follows.
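One way to wire MSTest in, assuming Visual Studio 2008's MSTest.exe and a hypothetical test container path:
<exec executable="C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\MSTest.exe">
  <buildArgs>/testcontainer:C:\develop\project1WorkingDir\Tests\bin\Debug\Foo.Tests.dll</buildArgs>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
</exec>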
Package and move to the staging server
  • Add an exec task for WSPBuilder:
<exec executable="C:\Program Files\WSPTools\WSPBuilderExtensions\WSPBuilder.exe"><baseDirectory>C:\Foo\solutions</baseDirectory>      <buildTimeoutSeconds>360</buildTimeoutSeconds></exec>
  • Add a task which executes an xcopy command to move the WSP packages to the staging server:
<exec executable="C:\WINDOWS\system32\xcopy.exe">
<buildArgs>C:\Foo\Solutions\Output.WSP\*.wsp \\md-stagingSvr\share</buildArgs></exec>
Remote deployment with Sysinternals' PsExec

  • Add another exec task to invoke the PsExec tool to do the remote deployment; a follow-up fragment that actually deploys the solution appears after the snippet below.
  • Alternatively, if you have Windows PowerShell on these machines, you can use PowerShell remoting to do the deployment.
<exec executable="C:\SysinternalsSuite\psexec.exe">
<buildArgs>\\md-stagingSvr-u "md-stagingSvr\aravind" -p 123$ "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" -o addsolution -filename c:\share\output.wsp</buildArgs></exec>
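addsolution only stages the package in the farm's solution store; a second fragment along the same lines, sketched here with the same credentials and an assumed solution name, would actually deploy it:
<exec executable="C:\SysinternalsSuite\psexec.exe">
  <buildArgs>\\md-stagingSvr -u "md-stagingSvr\aravind" -p 123$ "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" -o deploysolution -name output.wsp -immediate -allowgacdeployment</buildArgs>
</exec>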


Saturday, March 26, 2011

Notes from Tech Ed 2011 India - 2

TFS: Continuous Global Delivery was presented by Vinay Badami; the following notes were taken from the session.

  • Supports client installations- Windows Vista, Windows 7
  • Supports 64 bit architecture
  • Supports Java development within Eclipse(using Team Explorer Everywhere 2010)

Build Automation

  • Integrate early, integrate often
  • Continuous integration
  • Builds on every Check-in
  • Gated check-in - the code gets checked in to the source code repository only if the build succeeds.
  • Rolling builds to control frequency.

Branches and visualization

  • Visualize integration across all branches
  • Track the branches and change set.

Integrates with other version control systems and bug tracking systems using the TFS Integration Platform.

Friday, March 25, 2011

Notes from Tech Ed 2011 India - 1

"Design,Performance and capacity factors for successful intranet and internet SharePoint sites" session was presented by Sanjay Narang from MCS and following notes were taken from the session. All credit goes to Sanjay for enlightening the full house.

RPS - Requests per second
An indicator expressing the number of requests served by the web farm, irrespective of request size.

Points to note while tracking server load:

  1. All users - the total user base of the application
  2. Active users - users currently using the application
  3. Concurrent users - users using the same functionality at the same moment (given HTTP's disconnected architecture, these will be very few)
(Image from Microsoft TechNet)


RPS calculation (a C# sketch follows the list)
  1. Take the IIS logs from all the servers in the web farm.
  2. Take the logs from a peak traffic hour of a day.
  3. Use tools like Log Parser to query the collected log files. (How to use Log Parser?)
  4. Exclude requests for images, CSS and JS while calculating RPS.
  5. Exclude HTTP 401s as well.
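A minimal C# sketch of that calculation over one peak hour of a W3C-format IIS log; the log path and field positions are assumptions, so check the #Fields: header of your own logs:

//count requests over a peak hour, skipping static assets (rule 4) and 401s (rule 5)
//assumes the default IIS 6.0 W3C field order: cs-uri-stem at index 5, sc-status at index 11
int counted = 0;
foreach (string line in File.ReadAllLines(@"C:\WINDOWS\system32\LogFiles\W3SVC1\ex110510.log"))
{
    if (line.StartsWith("#")) continue;   //skip the #Software/#Fields header directives
    string[] f = line.Split(' ');
    string stem = f[5].ToLowerInvariant();
    if (stem.EndsWith(".gif") || stem.EndsWith(".jpg") || stem.EndsWith(".png") ||
        stem.EndsWith(".css") || stem.EndsWith(".js"))
        continue;
    if (f[11] == "401") continue;
    counted++;
}
Console.WriteLine("Average RPS over the peak hour: " + counted / 3600.0);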
Decision points
  1. Average RPS on a day
  2. Average RPS in a peak hour
  3. Max RPS in a peak hour
  4. Average daily concurrent users
  5. Peak concurrent users at peak time
Dataset
The size of the data stored in a system is called the dataset; in the SharePoint world this typically means the content database and its size.

Points to consider while planning for the dataset:
  1. Content database size
  2. Temp DB size
  3. Transaction log size
  4. Space required for a full backup
  5. Number of documents and versions in the content DBs
  6. Amount of metadata associated with the documents
  7. Number of other list items (calculate the size of each list item too)
Performance and reliability
  1. Server availability - overall uptime of the system
  2. Server responsiveness - the farm's time taken to serve a request
  3. System resource utilization - CPU utilization and available memory
Boundaries and limits

When planning capacity and management, it is essential to keep a tab on the product boundaries and limits:
  1. Content database size - 200 GB
  2. Site collection size - 100 GB max
  3. Site collections per content database - 5000 max/2000 recommended
General server recommendations
  • One WFE server per 10k users
  • One SQL instance per 4 WFE servers
  • 3-5 WFE cores per SQL core
  • One DC per three WFE servers
  • 2-4 GB memory per CPU core
High availability - points to consider
  • Hardware/software load balancers
  • Clustering/mirroring of SQL Server
  • Service applications associated with the app
  • Search target, dedicated/distributed
SQL Server memory recommendations
  • Small - 8 GB
  • Medium - 16 GB
  • To handle up to 2 TB of data - 32 GB
  • To handle 2 TB to 5 TB - 64 GB
  • To handle more than 5 TB - go for a new instance of SQL Server
As there is no perfect tool to tell you what kind of farm you need (well, there is a tool called the HP Sizer), a careful assessment of all these points against your requirements is necessary to arrive at a conclusion. It is always better to oversize than to undersize. (Want to know whether your servers are oversized or undersized? Use performance monitor counters to gather data and analyse it.)

I guess most of the points were taken from this TechNet article; for those who need the full set of information, please go through that link. I couldn't find the deck on the Tech Ed site; I'll provide a link when it's available.