Friday, July 28, 2017

How to inventory all Repliweb Jobs

I recently got an opportunity to explore Attunity's Repliweb, a content deployment tool that can deploy packages from a source code repository to multiple targets. Although the GUI provides options to manage deployments and their resulting logs, it lacks an option to export an inventory of all jobs.

But it does expose an API via PowerShell that can be used to build the inventory. As I was not able to find such a script anywhere, I thought of sharing it via this post.

The script below creates a CSV file in the current directory containing all jobs along with their status and edge lists.

<#
.SYNOPSIS
Fetches all jobs from Repliweb center
.DESCRIPTION
Fetches all jobs from Repliweb center
.PARAMETER username
A repliweb username
.PARAMETER password
A repliweb password for the username provided
.PARAMETER domain
Domain name for the username
.EXAMPLE
.\fetchJobs.ps1 -username "JohnDoe" -password "secretword" -domain "contoso"
.NOTES
Run this script in a repliweb center's powershell console as administrator.
#>


[CmdletBinding()]
Param(
[parameter(mandatory=$true)]
[string] $username,

[parameter(mandatory=$true)]
[string] $password,

[parameter(mandatory=$true)]
[string] $domain
    )

if ((Get-PSSnapin -Name r1_ps_api -ErrorAction SilentlyContinue) -eq $null )
  {  
      Add-PsSnapin r1_ps_api
  }
##-----------------------------------------------------------------------

## Global Constants
$fileName="$($env:Computername)_$(get-date -format `"yyyyMMdd_hhmmsstt`").csv"
$currentDir=Split-Path $script:MyInvocation.MyCommand.Path
$logfile =  join-path $currentDir $fileName

##-----------------------------------------------------------------------
##-----------------------------------------------------------------------
## Global functions:
function write-log([string[]]$logline)
{
  $logline | out-file -Filepath $logfile -append
}
##-----------------------------------------------------------------------

$R1Session=Get-R1Session
$pass=$R1Session.ScramblePassword($password)
$R1Session.Center="localhost"
$R1Session.User=$username
$R1Session.ScrambledPassword=$pass
$R1Session.Domain=$domain
#$R1Session.QueryFilter.JobType= "R1_DISTRIBUTION"
$connected=$R1Session.Connect()
$jobs=$R1Session.GetJobList()
write-host " Session established and jobs retrieved"
$header="ID,Name,Type,State,SourceDirectory,Stage,Edges"
write-log $header
foreach($job in $jobs)
{
    $edges=''
    # $job.info is DistributionInfo or MSIInfo depending on the job type
    foreach($edge in $job.info.edgesNames)
    {
        $edges += $edge + ";"
    }
    # String interpolation avoids type errors when a property is not a string
    $logline = "$($job.Id),$($job.Name),$($job.Type),$($job.State),$($job.SourceDirectory),$($job.Stage),$edges"
    write-log $logline
}
write-host " Export process completed."

Wednesday, May 31, 2017

My journey towards AWS certification

I cleared the AWS Solutions Architect - Associate certification last week and would like to share the reference materials I used for preparation. The exam was focused on Elastic Compute Cloud, VPC, IAM and Auto Scaling. For anyone trying to get the certification, the following reading materials will help. As a prerequisite, register for the AWS free tier, which gives you 750 hours of EC2, 5 GB of S3 and 750 hours of RDS.


Ryan Kroonenberg's Udemy course

Good for getting your feet wet in the AWS environment. The purpose of this course is to get you ready for the certification, not to go into the depths of AWS concepts. Every video provides a hands-on exercise that helps you understand the concepts by working through them.
Tip: this course is almost always on sale for $10; if it is not, try a different browser or browse in incognito mode.

AWS Certified Solutions Architect Official Study Guide

This is the official study guide penned by technical bigwigs at Amazon. It provides in-depth knowledge of each area of AWS. Every chapter provides a sample quiz and a hands-on exercise. If you've done the Udemy course above, most of the exercises are already covered; you can complete the rest as you read the book.

The only drawback is that the book is a bit dated, at least two re:Invent events behind. So newer features such as enhanced networking, the revamped S3 UI, NAT gateways, licensing changes and a few capacity-related concepts are missing from it.

Wiley test bank

This study guide offers 300+ quiz questions for practice. The drawback of the book carries over here: some of the questions and answers are obsolete, so double-check the product FAQ pages for the latest changes.

Qwiklabs

This is a paid way of doing hands-on exercises, but if you have a free trial in AWS you can read the instructions provided there and execute most of the steps in your own AWS account. If you are willing to pay, they work on a credits system and create a fresh AWS account every time you launch a new lab.

Whitepapers and FAQs

Go through the recommended whitepapers (8 of them) and FAQs (6 of them) to understand the most recent updates.

Cloudacademy

Cloud Academy has around 300+ quizzes to practice with. They offer timed quizzes where you are forced to answer each question within a pre-determined time based on its complexity. The good thing is they offer a seven-day trial period, so I picked this as the last step in my preparation just to benchmark my reading. They'll charge your credit card $30 per month if you forget to cancel the trial subscription after seven days.

Sunday, May 28, 2017

Mindmaps from my AWS cert preparation


Following are the topics of the mind maps I prepared as part of my reading for the AWS certification (the map images are not reproduced here):

  • Databases
  • Elastic Compute Cloud - EC2
  • Identity and Access Management
  • Simple Storage Service - S3
  • Storage Gateway
  • Amazon's key services
  • Managed services

Friday, July 22, 2016

FireFox and Certificate authentication

When I visit a SharePoint site secured with PKI certificate authentication, I usually get the certificate selection prompt in Internet Explorer and Chrome. Both IE and Chrome are smart enough to show me the list of certificates available in my Personal certificate store.

But when I browsed the same PKI-secured site in Firefox, I didn't get the certificate selection page, so I poked around the issue for a solution today. The cause is that Firefox does not look in your Personal certificate store. So even though you've imported your certificates into the Personal store, you must import them into Firefox's own certificate store too.

First, export your soft key (certificate) from your Personal certificates to a folder.

Open Firefox, navigate to Tools->Options->Advanced->Certificates->View Certificates, and import the exported certificate there.

After this setting, you'll be shown a certificate selection dialog box every time you browse a site authenticated with a certificate.

Tuesday, July 19, 2016

Hide a SharePoint ECB menu

There was a requirement to hide a menu item in a document library's ECB menu. The specific menu item is "Compliance Details".

After poking around in JavaScript for some time, a CSS fix turned out to be the most elegant option thanks to CSS selectors.

Add a Script Editor web part and add the below:

<style>
li.ms-core-menu-item[text="Compliance Details"]
 {display: none !important;}
</style>

This finds all li elements with the class "ms-core-menu-item" whose "text" attribute value is "Compliance Details", then applies the new style.

CSS selectors allow us to match any menu item by matching them with a display text of the menu.

Wednesday, June 29, 2016

Site Collections Size report

I wanted a quick script to get all site collections' sizes along with their last modified dates and visit counts. PowerShell is cool enough to do that in a couple of lines.

Get-SPSite -limit All |
    select url,
        @{label="Size in MB"; Expression={"{0:N2}" -f ($_.Usage.Storage/1000000)}},
        @{label="ContentDB"; Expression={$_.ContentDatabase.Name}},
        @{label="Visits"; Expression={$_.usage.Visits}},
        @{label="LastModified"; Expression={$_.RootWeb.LastItemModifiedDate}} |
    epcsv SiteReport.csv -NoTypeInformation

Thursday, June 23, 2016

Reviving datasheet view in SharePoint 2013 for selective sites


In SharePoint 2013 the Datasheet view was deprecated by Microsoft and Quick Edit was promoted in its place. The Datasheet view is an ActiveX component with a tight dependency on Internet Explorer, while Quick Edit is based on HTML5, CSS and JS, achieving cross-platform and cross-browser compatibility.
The Datasheet view on the legacy platform is ideal for doing bulk row edits and inserts without performance issues, with a few caveats.

How does the Datasheet view work?

The Datasheet view is an ActiveX component called ListNet from STSList.dll, which accepts the below parameters to work properly:
  • ListName
  • ViewGuid
  • RootFolder
  • ListWeb
  • ListData
  • ViewSchema
  • ListSchema

Pros and Cons

 Pros

  • Users get the legacy control working in the same fashion, without needing to export to Excel.

Cons

  • Document Center is not supported.
  • Managed metadata is not supported.
  • Not cross-browser or cross-platform compatible.

Wednesday, July 2, 2014

SharePoint grayed out Multiple file uploader option

After upgrading to Internet Explorer 11 and Office 2013, the multiple file upload option for a SharePoint document library was grayed out.

I ensured the STSUPLD control was enabled in the Add-ons menu and verified that the 32-bit version of Internet Explorer was loading the site. Still no luck, and I was also unable to access the Datasheet view of a document library or list.

The trick is to un-check the "ActiveX Filtering" option in the Tools menu. Additional browser security...




Thursday, March 14, 2013

Deploy SharePoint solutions at web application level

By default, SharePoint solution packages get installed at the "Globally" scope when they don't contain any web-application-scoped resources. It seems illogical to deploy at web application scope when a package only has assemblies for the GAC, application pages for the Layouts folder and so on. But if a package does contain web-application-scoped resources, such as assemblies deployed to the application's bin directory or safe control entries, then solution deployment forces you to deploy at the web application level.

Sometimes you will have only GAC assemblies (not a web part project) and no resources scoped to a web application. Even then, you may need to deploy the solution at web application scope for various reasons:

  • Security - to hide features that are part of the solution from showing up in other web applications.
  • Shared environment - other applications run in the farm and should not be impacted by your WSP.
  • Isolation - a globally scoped WSP will recycle all app pools during deployment.
This can be achieved by adding a safe control entry to the solution package, which satisfies the deployment scope check. In MOSS 2007 it is just a matter of editing Manifest.xml. But SharePoint 2010 development is powered by Visual Studio 2010 Tools for SharePoint, and if you modify Manifest.xml by hand, the package designer withdraws its support for the project.
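For the MOSS 2007 route, a minimal sketch of what a hand-edited Manifest.xml safe control entry might look like follows; the assembly name, namespace and public key token here are placeholders, not values from any real project:

```xml
<!-- Manifest.xml (fragment) - placeholder names throughout -->
<Assemblies>
  <Assembly Location="Contoso.Sample.dll" DeploymentTarget="GlobalAssemblyCache">
    <SafeControls>
      <SafeControl Assembly="Contoso.Sample, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0000000000000000"
                   Namespace="Contoso.Sample"
                   TypeName="*"
                   Safe="True" />
    </SafeControls>
  </Assembly>
</Assemblies>
```

The SafeControl entry is a web-application-scoped resource, so its presence pushes the deployment scope to the web application level.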

The solution: open your package designer, scroll down to find the Advanced tab, and click the "Add" button to add additional assemblies.



Select "Add assembly from project output", then input the namespace and full assembly name for the safe controls.

Save these settings and click OK. One more setting to update: click the project in Solution Explorer, press F4 to open the Properties window, and exclude the assembly from being copied into the solution package, because we already added it in the "Additional Assemblies" section.


Package and deploy the solution at web application level scope.

Thursday, February 14, 2013

Portable mail server on desktop for SharePoint

I used to run the SMTP server that comes along with IIS 6.0 whenever I needed to test outgoing emails from a custom SharePoint solution. After a while I came to hate running a server process just to receive a bunch of emails relayed within my development machine; besides, for compliance reasons network administrators may block running an SMTP server on a desktop.

Here comes a nifty tool from CodePlex called Papercut to solve all these problems. OK, how does it resolve the SMTP puzzle?


  1. Download Papercut from CodePlex, unzip the package (note: there is no MSI package or setup.exe, so you are not violating software compliance either) and keep it in a handy location.
  2. Find your machine's IP using ipconfig and make a note of it.
  3. Map your IP in SharePoint's outgoing e-mail settings in the SharePoint Central Administration console.



Open Papercut.exe, click "Options" and map your machine's IP there. Keep it open and running; do not close it.


Now test all your email functionality; you should see all emails in the left pane with a time stamp.

Watch all your mails being forwarded to Papercut.exe.
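To sanity-check the relay without touching SharePoint, you can push a message at Papercut yourself. The sketch below is a minimal Python example, assuming Papercut is listening on the default SMTP port 25 on the local machine; the addresses are made up for illustration:

```python
import smtplib
from email.mime.text import MIMEText

# Build a simple test message (addresses are placeholders).
msg = MIMEText("Test mail relayed from my dev box.")
msg["Subject"] = "SharePoint outgoing mail test"
msg["From"] = "sharepoint@dev.local"
msg["To"] = "admin@dev.local"

try:
    # Papercut accepts any recipient and just displays the mail in its UI.
    with smtplib.SMTP("127.0.0.1", 25, timeout=5) as smtp:
        smtp.send_message(msg)
    print("sent - check the Papercut window")
except OSError:
    print("no SMTP listener on 127.0.0.1:25 - is Papercut running?")
```

If Papercut is running, the message shows up instantly in its left pane; if not, the script reports that nothing is listening.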


Wednesday, January 30, 2013

SharePoint User Profile Service - method not found exception

While trying to access the User Profile service application on a few machines, this exception came up:


Method not found: 'Microsoft.SharePoint.Administration.SPIdentifierType Microsoft.SharePoint.Administration.SPAce`1.get_BinaryIdType()'

We were unable to resolve it by re-provisioning the service application, restarting the timer jobs or restarting the servers. As the exception arises from a SharePoint assembly, we looked at the version of the server and found it was the SharePoint 2010 RTM version.

It requires the following updates to resolve the issue:


Don't forget to run the PSConfig wizard after installing the service pack and cumulative updates.

Re-provision the User profile service application and proceed.

Tuesday, October 30, 2012

Shredded Storage in SharePoint 2013

How many times did we think twice before enabling versioning on a document library that would handle huge files, during capacity planning exercises in MOSS 2007 and SharePoint 2010?

After enabling versions in a document library, imagine there is a 20 MB Word document DSTD.docx at v1.0 in that library and a user is editing it. SharePoint 2010 serves the full document to the Office Word client. After the user changes a couple of pages and hits Save/Sync, SP 2010 receives the full 20 MB modified document and stores it as a separate entry in the content database.

Say this specific document runs through ten major versions in SP 2010; the disk space it occupies in the SQL Server content database would be 20 MB * 10 versions = 200 MB.

SP 2013 changes this scene with the shredded storage technique. If you are a DB admin or an IT pro, you have probably heard of differential or incremental backups; shredded storage is similar.

In SP 2013, when the user hits Save/Sync after making changes to DSTD.docx v1.0, SP 2013 receives only the delta changes in the modified document and adds an entry for them in the content database.

So in SP 2013, enabling versioning is not going to eat up disk storage; it also reduces disk I/O, as only the delta is stored, and makes efficient use of network bandwidth.
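The arithmetic above can be sketched quickly. This is only an illustration of the storage difference, not SharePoint's actual shredding algorithm, and the average delta size is an assumed figure:

```python
FULL_SIZE_MB = 20      # size of DSTD.docx
VERSIONS = 10          # major versions kept
AVG_DELTA_MB = 0.5     # assumed average change per edit (illustrative)

# SP 2010 style: every version stores the whole file again.
full_copy_total = FULL_SIZE_MB * VERSIONS

# SP 2013 style: one full copy plus a delta per subsequent version.
shredded_total = FULL_SIZE_MB + AVG_DELTA_MB * (VERSIONS - 1)

print(f"Full-copy storage: {full_copy_total} MB")   # 200 MB, as computed above
print(f"Delta storage:     {shredded_total} MB")
```

Under these assumptions the shredded model stores about 24.5 MB instead of 200 MB for the same ten versions.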

Monday, July 16, 2012

No results in SharePoint Search but it exists in index

The SharePoint Search crawler crawled the content successfully and it shows in the crawl log. There are no crawl errors, the content is a plain Word document, and the web application is properly associated with the Search service application, yet not even a single search result shows up.

This was a bit embarrassing, and to nail down the issue I looked into the Windows event logs and the SharePoint ULS logs, but there was no interesting message to troubleshoot from.

I removed the SSA association, deleted the SSA, created a new SSA and re-did the association. The result was still none.

I verified that the search service account has read permission on the web application and on the database as well. The permissions exist, yet I still didn't get a single search result.

Finally I enabled "Verbose" ULS logging and filtered on the "Query Processor" category. New things started appearing in verbose mode:

AuthzInitializeContextFromSid failed with 1355. The querying user's Active Directory object may be corrupted, invalid or inaccessible. Query results which require non-Claims Windows authorization will not be returned to this querying user.

Not sure why this exception shows up under the "Unexpected" category, but I had recently configured a local admin account as the service account through PowerShell scripts. Probably that was the culprit.

After hours of research I found an MS KB article explaining the same symptoms and cause.

Open your SharePoint 2010 Management Shell console as an administrator and run the following PS script:

$searchApp = Get-SPEnterpriseSearchServiceApplication "SSA Name"
$searchApp.SetProperty("ForceClaimACLs",1)

Do a full crawl on all content sources to get the search results. Don't forget to switch off the verbose ULS logging.

Wednesday, July 4, 2012

Render a spreadsheet as HTML with Excel REST services

An interesting question from one of my peers: can we show an Excel spreadsheet using REST services? They didn't want all the features of the Excel Web Access web part, just the spreadsheet on a web page.

This question drove me to look at the capabilities of Excel Services. IMHO Excel Services was made for calculation and resource-heavy number-crunching jobs, with the Excel Web Access web part used to render the sheets. But after looking at the REST API (yes, it's beautiful), my thinking about this service changed completely.

First things first

  1. You need a SharePoint 2010 Enterprise server with Excel Services configured properly.
  2. Make sure you see ExcelRest.aspx under \14\ISAPI.

How to use ExcelRest.aspx?

Say you have an Excel file stored at a document library path like below:

http://Foo/Site/Shared Documents/Employee.xlsx
then the URL will be
http://Foo/Site/_vti_bin/ExcelRest.aspx/Shared Documents/Employee.xlsx

If you are like me and have worked a lot with asmx web services, hitting just the endpoint http://Foo/Site/_vti_bin/ExcelRest.aspx gives you an HTTP 400 error. It's simple: REST is representational and nothing is represented here; it's just an endpoint.

How to show as HTML and what are the other options ?


  1. /_vti_bin/ExcelRest.aspx/Shared Documents/Employee.xlsx/Model - discover the subsets (Ranges, Charts, Tables, PivotTables) of what you are going to query.
  2. /_vti_bin/ExcelRest.aspx/Shared Documents/Employee.xlsx/Model/Ranges('Sheet1!A1|H15')?$format=html - from Sheet1, fetch the data in the specified range and render it as HTML.
  3. /_vti_bin/ExcelRest.aspx/Shared Documents/Employee.xlsx/Model/Charts('RevenueChart')?$format=image - fetch the chart and render it as a PNG image.

What are the formats available for me to render ?


  1. ?$format=html - renders a plain HTML fragment; for tables, sheets and pivot tables.
  2. ?$format=atom - provides an ATOM feed.
  3. ?$format=image - renders a PNG image; only supported for chart outputs.
  4. ?$format=workbook - downloads the whole workbook.
Although some features are unsupported, given the OOTB Excel Services luxuries like request load balancing and caching, it is worthwhile in a scalable solution.
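These URL patterns are easy to compose programmatically. Here is a small Python sketch using the placeholder site and file names from above; quoting the path keeps spaces like the one in "Shared Documents" intact:

```python
from urllib.parse import quote

def excel_rest_url(site, library, workbook, resource="Model"):
    """Compose an ExcelRest.aspx URL for a workbook resource."""
    path = quote(f"{library}/{workbook}")  # e.g. Shared%20Documents/Employee.xlsx
    return f"{site}/_vti_bin/ExcelRest.aspx/{path}/{resource}"

url = excel_rest_url("http://Foo/Site", "Shared Documents", "Employee.xlsx",
                     resource="Model/Ranges('Sheet1!A1|H15')?$format=html")
print(url)
```

Fetching the resulting URL (with appropriate authentication) returns the HTML fragment for the range.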

Want to learn more? Refer to MSDN.

Monday, May 14, 2012

Prevent your mails from being forwarded or replied

I've always been amazed at the rights management plugin with IRM services provided by Microsoft, which restricts me from forwarding or replying to a protected mail. But what about small companies which don't have this IRM luxury in their infrastructure?

There is a niche solution from Microsoft, though still a Microsoft Research project: an Outlook plugin called "NoReplyAll", a lightweight plugin that installs right into Outlook 2007/2010. It adds additional buttons on the ribbon like below.


So whenever I send an email, I need to choose from the above options to make my mail secure. End users receiving the mails don't need this plugin in their Outlook. The plugin makes use of flag settings in Exchange and Outlook, so if the recipient is not using one of these, the flags are not honoured.

This cool plugin mandates the following:
  • Outlook should be the mail client on both the sender's and recipient's machines
  • Both users should be in the same domain using Exchange

If you want robust and rigid security then this plugin is not your choice; head for IRM.

Now you've got a plugin that avoids accidental and unnecessary "reply alls". Download and deploy it into your Outlook and enjoy.

It is a research project, not an RTM product, so beware.

Thursday, April 26, 2012

SharePoint performance tuning

There are various factors that come into play in performance tuning for SharePoint; let's see the major players.

HTTP Compression
  • It provides on-the-fly compression of files served from the web server to the browser.
  • It can be configured per file type extension (htm, js, css etc.).
  • This setting applies at the web application level and is disabled by default.
  • Compression puts pressure on the server's processor but gives an instant boost to the application.
  • HTTP sniffing tools like Microsoft Fiddler or HttpWatch can be used to verify the traffic.
Browser Caching
  • Modern browsers come with a private caching ability of up to 1 GB.
  • It can improve performance for sites with mostly static pages; it can be tweaked via IIS HTTP response headers (e.g. cache-control: max-age=3600; IE also honours the post-check/pre-check extensions).
  • Easy to configure and implement.
BLOB Caching
  • BLOB caching stores all of its content on disk and is also configured based on file types.
  • The file types, max size (10 GB by default) and an enable/disable switch are configurable in web.config.
  • It works at the web application level and is disabled by default.
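For reference, the BLOB cache switch lives in the web application's web.config; a typical entry looks like the following, where the cache location and file-type list are illustrative values:

```xml
<!-- web.config of the SharePoint web application -->
<BlobCache location="C:\BlobCache\14"
           path="\.(gif|jpg|jpeg|png|css|js)$"
           maxSize="10"
           enabled="true" />
```

maxSize is in GB, and flipping enabled to true is what actually turns the cache on.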
 Object Caching
  • Object caching caches the site navigation structure, published pages and resources, draft pages, etc.
  • It is enabled by default at the site collection level; a farm admin might find it a bit daunting to manage across all site collections :(.
  • It also stores cached objects on disk and shares that space with the BLOB cache.
  • Be a bit careful before increasing or decreasing this cache's allocation; an unplanned configuration change can deteriorate existing performance.
Browser Connections
  • Internet Explorer 7 and earlier versions limit the number of concurrent file downloads to two at a time. IE 8 relaxed this limit to six.
  • On faster connections, this browser setting can be a bottleneck that keeps users from seamless performance.
  • Microsoft provides a fix for this limitation  http://support.microsoft.com/kb/282402.
  • This restriction was imposed by IETF RFC 2068 Page 45 and the explanation goes like this
 "Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD maintain AT MOST 2 connections with any server or proxy. A proxy SHOULD use up to 2*N connections to another server or proxy, where N is the number of simultaneously active users. These guidelines are intended to improve HTTP response times and avoid congestion of the Internet or other networks."
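The KB 282402 fix boils down to two registry values. A sketch of the .reg file it describes follows; dword:00000010 raises the limit to 16 connections, and you can pick any value that suits your environment:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"MaxConnectionsPerServer"=dword:00000010
"MaxConnectionsPer1_0Server"=dword:00000010
```

Restart Internet Explorer after applying the change for it to take effect.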
Other options
  • Nowadays proxy servers and load balancers support static content caching; verify the feasibility of this.
  • An ISAPI filter from Aptimize can reduce significant performance bottlenecks.

Tuesday, December 27, 2011

Hyper V VM failed to set/change partition property error

I tried setting up a virtualized web farm on a machine running an Intel i5 with Windows Server 2008 R2 Standard RTM. Hyper-V Manager refused to start a virtual machine on the brand new host, even though it had enough memory and CPU cores.

It kept showing "VM could not initialize <VM GUID>" and "failed to set/change partition property" in Event Viewer.

After an hour of googling I found KB 2517374, with a reference to Intel AVX technology in Sandy Bridge processors, and some interesting stuff from Jeff Woolsey's chicken-and-egg case.

The root cause is that Intel's Sandy Bridge processors were released after Windows Server 2008 R2 RTM, so the OS doesn't contain the bits to leverage the new bells and whistles. As there is a slight change in the processor architecture itself, it refuses to start the virtual machines.

Either upgrade to Service Pack 1 to make use of Intel AVX (high-performance floating point calculations) in both host and guest, or install the above-mentioned hotfix to disable AVX for the guest OSes.

Friday, December 23, 2011

SharePoint List items bulk delete

Recently, while working on a production case, we needed to delete around 2000 list items in a single shot. My idea was to use the Datasheet view ActiveX control to do the bulk delete, but in this production case, where the server runs 64-bit and clients run 32-bit, there was a problem invoking the ActiveX control, so I shared the following snippet to do a batch delete.

private static void deleteAllListItems(SPSite site, SPList list)
{
    StringBuilder sbDelete = new StringBuilder();
    sbDelete.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?><Batch>");
    // Each Method element deletes one item by ID from the target list.
    string command = "<Method><SetList Scope=\"Request\">" + list.ID + "</SetList><SetVar Name=\"ID\">{0}</SetVar><SetVar Name=\"Cmd\">Delete</SetVar></Method>";
    foreach (SPListItem item in list.Items)
    {
        Console.WriteLine(item.Title);
        sbDelete.Append(string.Format(command, item.ID.ToString()));
    }
    // Close the batch element.
    sbDelete.Append("</Batch>");
    Console.WriteLine("now deleting");
    site.RootWeb.ProcessBatchData(sbDelete.ToString());
}

This snippet deletes a huge list much faster than the traditional object model way.

Monday, September 5, 2011

Set up a Dell Bluetooth mouse on Ubuntu 11.04

When the UI way is not working, it's time to go back to your terminal.

Get the Python scripts related to Bluetooth devices; open your terminal window and enter:

sudo apt-get install bluez python-gobject python-dbus

Check your hciX location

hcitool dev

Pair your Bluetooth device with Ubuntu:

sudo bluez-simple-agent hci0 xx:xx:xx:xx:xx:xx

xx:xx:xx:xx:xx:xx is your Bluetooth device address and it varies from device to device. Flip your mouse and look at the bottom label; you'll find something like "BT ADD: xx:xx:xx:xx:xx:xx". Now press the blue button on the bottom to make the device discoverable.


You'll be prompted for the PIN now; it is also on the bottom label, usually 0000. Key it in to pair the device.

Mark the device as a trusted device:

sudo bluez-test-device trusted XX:XX:XX:XX:XX:XX yes

Restart the Bluetooth daemon service:
sudo /etc/init.d/bluetooth restart

Verify the device was added successfully:
dmesg | tail
Toward the end, you'll see something like below:

Bluetooth: HIDP (Human Interface Emulation) ver 1.2

Now go to the top-right tool strip, click the Bluetooth icon, choose the device name (Dell BT Travel Mouse) and click Connect.

There you are; after a bunch of command line (or shell) commands I finally got my Bluetooth mouse connected.

Monday, August 29, 2011

Resolving FAST Document conversion Failed error

After installing and configuring Microsoft FAST Search Server, I uploaded a bunch of Word and PDF documents and images for crawling and indexing. Fingers crossed, I waited until the full crawl completed on the crawl history page. The logs said the documents were crawled successfully and were ready to be queried from the FAST index.

I was able to search a PDF document by its metadata but not by its content. The basic instinct is to install a PDF iFilter, but the server behaved the same even for Microsoft Office file formats, and Jie Li's blog states that FAST comes with a PDF iFilter built in.

I tried using docpush from the FAST toolbox as an exercise in isolating SharePoint from FAST:

docpush -c sp c:\test.doc

Docpush clearly gave me the error message "WARNING Document conversion Failed (Warning code 0)". I verified the FAST crawl logs and found the same error there for Office file formats and PDF documents.

To resolve this, grant the FAST service account "Full Control" on the folders C:\FastSearch\bin and C:\FastSearch\Tmp.

Refer : KB2554579