{"id":593,"date":"2013-03-21T11:28:53","date_gmt":"2013-03-21T11:28:53","guid":{"rendered":"https:\/\/poiseddevelopers.com\/reality-tech\/?p=593"},"modified":"2024-05-07T08:00:52","modified_gmt":"2024-05-07T08:00:52","slug":"generating-automatic-emails-with-embedded-reports-and-link-to-uploaded-csv","status":"publish","type":"post","link":"https:\/\/poiseddevelopers.com\/reality-tech\/generating-automatic-emails-with-embedded-reports-and-link-to-uploaded-csv\/","title":{"rendered":"Generating automatic emails with embedded reports and link to uploaded CSV"},"content":{"rendered":"<h2 data-wpmeteor-mouseover=\"true\" data-wpmeteor-mouseout=\"true\">Generating automatic emails with the embedded content<\/h2>\n<p>There\u2019s often a need to automate the generation of reports on SharePoint. I typically need to generate such a report based on:<br \/>\n1. Content using search: acquired via a Search Query<br \/>\n2. Content via manual crawl: walking through libraries, even sites, site collections and farms<br \/>\n3. Based on scripted actions taken, such as moving or copying documents on an automated basis.<\/p>\n<p>Other examples I\u2019ve done include reports on Checked Out documents, custom-delivered to the people who have documents checked out for 30+ days.<\/p>\n<p>There are a few parts to the script below:<br \/>\n1. Acquire the dataset for the report. In this example I walk through all SPItems in a given list.<br \/>\n2. Define how the email gets sent, and to whom, leveraging HTML email<br \/>\n3. Generate the HTML for this report<br \/>\n4. Create a temporary CSV of this report, and upload it to a SharePoint library<br \/>\n5. Send the email, ensuring it contains the summary report and a link to the uploaded CSV<\/p>\n<p>To store the dataset, I use the system.Data.DataTable object, and build from there. 
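<\/p>\n<p>For option 1 above (search-driven content), the query results can feed the same kind of DataTable. Here is a minimal, hedged sketch, assuming the Microsoft.Office.Server.Search assembly is available on the farm; the site URL and query text are illustrative, not part of the script below:<\/p>\n<pre lang=\"powershell\">\r\n# Sketch: acquire report rows via a search query instead of walking a list\r\n$site = Get-SPSite \"http:\/\/SharePoint\/sites\/mySiteCollection\"   # assumed URL\r\n$kq = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)\r\n$kq.QueryText = \"ContentType:Document\"   # illustrative query\r\n$kq.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults\r\n$kq.RowLimit = 500\r\n$results = $kq.Execute()\r\n$relevant = $results[[Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults]\r\n# ResultTable implements IDataReader, so it loads straight into a DataTable\r\n$dt = New-Object System.Data.DataTable\r\n$dt.Load($relevant)\r\n<\/pre>\n<p>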
It\u2019s easy to copy a DataTable, or drop columns after the HTML is constructed.<\/p>\n<p>I use the ConvertTo-Html cmdlet to convert the DataTable, explicitly referencing the fields and the order I\u2019d like in the HTML table. However, I have to fix up the escaped tags within it; for that I pipeline it through a function called Convert-HTMLEscape.<\/p>\n<p data-wpmeteor-mouseover=\"true\" data-wpmeteor-mouseout=\"true\">I prefer to tune up the DataTable and drop any unnecessary fields, as ConvertTo-Csv does not allow fields to be specified. I also specify -NoTypeInformation so the first row is the actual field names, allowing it to open correctly in Excel.<\/p>\n<p data-wpmeteor-mouseover=\"true\" data-wpmeteor-mouseout=\"true\">Note how the URLs are prettied up on the fly, so they read nicely in the HTML table.<\/p>\n<p data-wpmeteor-mouseover=\"true\" data-wpmeteor-mouseout=\"true\">Claims Authentication has a different account format, which I clean up in a separate function, Strip-ClaimsHeader.<\/p>\n<p data-wpmeteor-mouseover=\"true\" data-wpmeteor-click=\"true\" data-wpmeteor-mouseout=\"true\">Whether a file is checked out, the type of checkout, and who checked it out are some optional attributes I capture in the DataTable. 
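<\/p>\n<p>Those checkout attributes make a 30+ day checked-out report a simple filter over the finished table. A minimal sketch, assuming the column names defined in the script below and a Modified value that parses as a date:<\/p>\n<pre lang=\"powershell\">\r\n# Sketch: select rows checked out and untouched for 30+ days\r\n$cutoff = (Get-Date).AddDays(-30)\r\n$stale = $table.Rows | Where-Object {\r\n    $_.CheckedOut -eq \"Yes\" -and [datetime]$_.Modified -lt $cutoff\r\n}\r\n<\/pre>\n<p>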
This table is easily extended to support whichever metadata is desired.<\/p>\n<p data-wpmeteor-mouseover=\"true\" data-wpmeteor-mouseout=\"true\">Here\u2019s the script:<\/p>\n<pre lang=\"powershell\"> \r\nif(!(Get-PSSnapin Microsoft.SharePoint.PowerShell -ea 0)) \r\n{ \r\nWrite-Progress -Activity \"Loading Modules\" -Status \"Loading Microsoft.SharePoint.PowerShell\"\r\nAdd-PSSnapin Microsoft.SharePoint.PowerShell \r\n} \r\n \r\n$ToRecipients = \"joelplaut@MyDomain.com\"\r\n \r\n$CCRecipients = $ToRecipients; #set to any audience you want, comma separated\r\n$ToRecipientsArray = $ToRecipients.Split(\",\");\r\n$CCRecipientsArray = $CCRecipients.Split(\",\");\r\n \r\n[void][System.Reflection.Assembly]::LoadWithPartialName(\"Microsoft.SharePoint\")\r\n[void][System.Reflection.Assembly]::LoadWithPartialName(\"Microsoft.SharePoint.Administration\")\r\n \r\n \r\nfunction Save-File([string] $Dir, [string] $PartialName, [system.Data.DataTable] $MyTable)\r\n{\r\n#writes the file to a temporary disk-based location.  I add a timestamp to the name to ensure uniqueness.  $PartialName is a text description used to start the filename\r\n# export-csv does all the heavy lifting.  
Note the attribute \"-NoTypeInformation\" is needed so there isn't a type header on top of the field headers\r\n$timestamp = Get-Date\r\n$timestampstr = $timestamp.Month.ToString(\"00\") + \"_\" + $timestamp.Day.ToString(\"00\") + \"_\" + $timestamp.Year.ToString(\"0000\") + \" \" + $timestamp.Hour.ToString(\"00\") + \"_\" + $timestamp.Minute.ToString(\"00\") + \"_\" + $timestamp.Second.ToString(\"00\") \r\n \r\n$FileName = $Dir + '\\' + $PartialName+'_'+$timestampstr+'.CSV';\r\n \r\n$MyTable.rows | Export-Csv -Path $FileName -NoTypeInformation\r\n \r\nreturn $FileName;\r\n \r\n}\r\n \r\n#uploads the file to the Reporting Library within SharePoint; the timestamped name ensures uniqueness\r\nfunction Upload-File([string] $FilePath)\r\n{\r\n    $Files = $RepList.rootfolder.Files\r\n    $DocLibName = $RepList.Title; #note: the list Title may differ from the URL folder name\r\n    #Get File Name from Path\r\n    $FileName = $FilePath.Substring($FilePath.LastIndexOf(\"\\\")+1)\r\n      \r\n    #delete the file from the library if it already exists.  
This should never occur\r\n    try\r\n    {\r\n        if($Files.Item($DocLibName +\"\/\" + $FileName))\r\n        {\r\n            $Files.delete($DocLibName +\"\/\" + $FileName)\r\n        }\r\n    } catch {}\r\n    #Get the File\r\n    $File= Get-ChildItem $FilePath\r\n      \r\n    #Add File to the collection\r\n    $x=$Files.Add($DocLibName +\"\/\" + $FileName,$File.OpenRead(),$false)\r\n      \r\n    $ReportFilename = $RepList.ParentWeb.Url + '\/'+ $DocLibName + '\/' + $Filename;\r\n    return $ReportFilename;\r\n}\r\n \r\n \r\n# If we are running in a Claims Authentication environment, we can strip off the claims tags\r\nFunction Strip-ClaimsHeader ($s)\r\n{\r\n    if ($s.IndexOf('#') -gt 0)  #handle stripping claims tags off the name\r\n    {\r\n        return $s.Substring($s.IndexOf(\"#\")+1)\r\n    }\r\n    else\r\n    {\r\n        return $s\r\n    }\r\n}\r\n \r\n#used for HTML processing and fixup; this is a bit dense, but it supports the PowerShell pipeline, and restores the HTML tags\r\nFunction Convert-HTMLEscape {\r\n \r\n&lt;#\r\nconvert &amp;lt; and &amp;gt; to &lt; and &gt;\r\nIt is assumed that these will be in pairs\r\n#&gt;\r\n \r\n[cmdletbinding()]\r\n \r\nParam (\r\n[Parameter(Position=0,ValueFromPipeline=$True)]\r\n[string[]]$Text\r\n)\r\n \r\nProcess {\r\nforeach ($item in $text) {\r\n    if ($item -match \"&amp;lt;\") {\r\n        &lt;#\r\n          replace codes with actual symbols. This line is a shortcut to do two replacements\r\n          with one line of code. The code in the first set of parentheses replaces the escaped code with \"&lt;\". This normally gets written to the pipeline. 
By wrapping it in parentheses, PowerShell treats it as an object so I can then call the Replace() method again and add the &gt;.\r\n        #&gt;\r\n        (($item.Replace(\"&amp;lt;\",\"&lt;\")).Replace(\"&amp;gt;\",\"&gt;\")).Replace(\"&amp;quot;\",'\"')\r\n     }\r\n     else {\r\n        #otherwise just write the line to the pipeline\r\n        $item\r\n     }\r\n }\r\n} #close process\r\n \r\n} #close function\r\n \r\n$env = \"Prod\"  # I like setting flags for changing environment settings\r\n \r\nif ($env -eq \"Prod\")\r\n{\r\n    $RefListLocation = \"http:\/\/SharePoint\/sites\/mySiteCollection\/Subsite\"\r\n    $TempLoc = \"C:\\TempLocation\"  # disk location to temporarily store the file\r\n}\r\n \r\n \r\n$table = New-Object system.Data.DataTable \"JoelFancyReport\"  #this is an internal name, and is not very important to set\r\n \r\n#let's define a field (column) for every possible field; even those we don't yet use in a given report\r\n$col1 = New-Object system.Data.DataColumn Link,([string])\r\n$col2 = New-Object system.Data.DataColumn BusinessProcess,([string])\r\n$col3 = New-Object system.Data.DataColumn DocType,([string])\r\n$col4 = New-Object system.Data.DataColumn Modified,([string])\r\n$col5 = New-Object system.Data.DataColumn Title,([string])\r\n$col6 = New-Object system.Data.DataColumn FileName,([string])\r\n$col7 = New-Object system.Data.DataColumn Editor,([string])\r\n$col8 = New-Object system.Data.DataColumn CheckedOut,([string])\r\n$col9 = New-Object system.Data.DataColumn CheckedOutBy,([string])\r\n$col10= New-Object system.Data.DataColumn FileFolder,([string])\r\n$col11= New-Object system.Data.DataColumn FileLink,([string])\r\n$col12= New-Object system.Data.DataColumn ReportAction,([string])\r\n \r\n# Just add the columns to the 
table\r\n$table.columns.add($col1)\r\n$table.columns.add($col2)\r\n$table.columns.add($col3)\r\n$table.columns.add($col4)\r\n$table.columns.add($col5)\r\n$table.columns.add($col6)\r\n$table.columns.add($col7)\r\n$table.columns.add($col8)\r\n$table.columns.add($col9)\r\n$table.columns.add($col10)\r\n$table.columns.add($col11)\r\n$table.columns.add($col12)\r\n \r\n#we can create multiple tables with the same layout easily. \r\n# Copy is shown here for reference, and is not used in this simple example\r\n$TableCopy = $table.Copy() \r\n \r\n#loop through whatever SPItems need to be looped\r\n# here is where the basic logic gets put, before generating the actual report\r\n# this is just a placeholder loop of items in a list, but this could be anything, including SPQuery results\r\n \r\n$JPWeb = get-spweb $RefListLocation\r\n$JPList  = $JPWeb.lists[\"MyList\"]\r\n$RepList = $JPWeb.lists[\"Reports\"]  #library that receives the uploaded CSV; the name here is an example, adjust to your reporting library\r\n$JPItems = $JPList.items;\r\n$JPItemCount = $JPItems.count;\r\n \r\nfor ($k=0; $k -lt $JPItemCount; $k++)\r\n{\r\n    $SourceItem = $JPItems[$k];\r\n    $ReportAction = \"\";\r\n \r\n    if ($SourceItem.File.CheckOutStatus -ne \"None\")\r\n    {\r\n        $ReportAction += \", CheckedOut \"+$SourceItem.File.CheckOutStatus+\" to:\"+$SourceItem.File.CheckedOutBy.displayname;\r\n        $CheckedOut = \"Yes\"\r\n        $CheckedOutBy = $SourceItem.File.CheckedOutBy.displayname;\r\n    }\r\n    else\r\n    {\r\n        $CheckedOut = \"No\"\r\n        $CheckedOutBy = $null;\r\n    }\r\n    #let's create a new row \r\n    $row = $Table.NewRow()\r\n \r\n    $FileLink = $($JPWeb.Url)+'\/'+$($SourceItem.Url);\r\n    $row.FileLink = $FileLink;\r\n    $row.Link = '<a href=\"' + $FileLink + '\">Link<\/a>';\r\n \r\n    $FolderURL = 
$FileLink.substring(0,$FileLink.LastIndexOf('\/'));\r\n \r\n    $row.FileFolder = $FolderURL;\r\n \r\n    $row.Title = $SourceItem.Title;\r\n \r\n    $row.FileName = $SourceItem.Name;\r\n \r\n    $row.Editor = Strip-ClaimsHeader $SourceItem[\"Editor\"];\r\n \r\n    $row.Modified = $SourceItem[\"Modified\"];\r\n \r\n    $row.BusinessProcess = $SourceItem[\"BusinessProcess\"] #sample field; add as many as you'd like\r\n    $row.DocType = $SourceItem[\"DocType\"]\r\n \r\n    $row.CheckedOut = $CheckedOut;\r\n    $row.CheckedOutBy = $CheckedOutBy;\r\n    $row.ReportAction = $ReportAction;\r\n \r\n    $Table.Rows.Add($row)\r\n}\r\n \r\n$RunStatus = \"All the processing completed $(get-date)\"\r\n$Header = \"Weekly report for my friends\"\r\n$emailTitle = \"Weekly Report email $($RunStatus)\"\r\n \r\n# this is a nice table style: collapsed, colored header (the CSS here is an example; tune as desired)\r\n$a = \"<style>TABLE {border-collapse: collapse;} TH {background-color: #4f81bd; color: white;} TD, TH {border: 1px solid black; padding: 3px;}<\/style>\"\r\n \r\n# Let's create the HTML table, specifying only the columns from the table that we'd like to see.  Any other columns won't appear in this report\r\n# By piping it to the Convert-HTMLEscape function, it fixes up the HTML\r\n \r\n$MyOutput = $table | ConvertTo-Html Link, BusinessProcess, DocType, Editor, Modified, FileName, ReportAction -head $a -body \"$($table.Rows.Count) $($Header)\" | Convert-HTMLEscape\r\n \r\n# if we need to reuse the table, just make a copy of it first.\r\n# While ConvertTo-Html offered the luxury of column selection, ConvertTo-Csv converts every column, so let's remove the extra ones now.\r\n$table.Columns.Remove(\"Link\") #how to remove a Column. 
links don't look too pretty within a CSV, so for an example I remove it here.\r\n \r\n$FileDescription = \"WeeklyReport\"  #text used to start the CSV filename; any description works\r\n$f2 = Save-File $TempLoc ($FileDescription) $Table  #Saves the temp file to disk, driven out of the $Table\r\n$Report2URL = Upload-File $f2\r\n \r\n$ReportSummary = \"Rolled Report Available at: \" + '<a href=\"' + $Report2URL + '\">' + $FileDescription + ' Custom Report<\/a>';\r\n \r\n#debug technique; it's easy to route the HTML to a file for examination, see the commented out line below:\r\n#$MyOutput &gt; C:\\A.html \r\n[string] $From = \"SharePointSupport@MyDomain.com\"\r\n[string] $To = $null; #$Recipients #\"joelplaut@MyDomain.com\"\r\n[string] $Title = $emailTitle;\r\n$SmtpClient = New-Object System.Net.Mail.SmtpClient\r\n$SmtpServer = \"mail.MyDomain.com\"\r\n$SmtpClient.host = $SmtpServer\r\n$MailMessage = New-Object system.net.mail.mailmessage\r\n\r\n$mailmessage.from = $From;\r\nforeach ($recip in $ToRecipientsArray)\r\n{\r\n    $mailmessage.To.add($recip)\r\n}\r\nforeach ($recip in $CCRecipientsArray)\r\n{\r\n    $mailmessage.CC.add($recip)\r\n}\r\n$mailmessage.Subject = $Title\r\n$mailmessage.Body = ($myoutput | Out-String) + $ReportSummary  #embed the summary report plus the link to the uploaded CSV\r\n$MailMessage.IsBodyHtml = $true\r\n$smtpclient.Send($mailmessage)\r\n<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>Generating automatic emails with the embedded content There\u2019s often a need to automate the generation of reports on SharePoint. I typically need to generate such a report based on: 1. Content using search: acquired via a Search Query 2. Content via manual crawl: walking through libraries, even sites, site collections and farms 3. 
Based on [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":596,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[31],"tags":[],"class_list":["post-593","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-alerts"],"acf":[],"_links":{"self":[{"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/posts\/593","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/comments?post=593"}],"version-history":[{"count":12,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/posts\/593\/revisions"}],"predecessor-version":[{"id":3824,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/posts\/593\/revisions\/3824"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/media\/596"}],"wp:attachment":[{"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/media?parent=593"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/categories?post=593"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/poiseddevelopers.com\/reality-tech\/wp-json\/wp\/v2\/tags?post=593"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}