The App Model in Detail (IV): Building a High-Trust App

In the previous installment of this series on the app model in detail, I talked about Low-Trust and High-Trust applications in SharePoint 2013. Well, today we're going to get our hands dirty and build a high-trust app (also known as S2S, server-to-server) from scratch.

Preparing the environment

User Profile service

To start building a high-trust application, we need to configure a few things in our local SharePoint 2013 environment. (You do remember that high-trust apps are only possible with an on-premises SharePoint, right?) There aren't many settings, but it's easy to forget one of them and then have a hard time tracking down where an error comes from.

First of all, our SharePoint 2013 installation needs the User Profile service up and running, with the profiles of the users we are going to use in the application already indexed. This is necessary because the authentication of a high-trust app has to "find" the user in the SharePoint profile service in order to execute queries on their behalf. If the user's profile isn't there, authentication will fail.

In fact, the access token the app sends to SharePoint contains the user's identifier, and SharePoint relies on it to decide whether the user is valid, looking it up in its profile database. The identifier is usually the Windows user's SID, their UPN or their Active Directory user name. If we use other authentication systems, such as FBA or claims, the identifiers will be different. It is strictly necessary that the user's identifier is present in their profile and that there are no duplicates. If you are really curious, there is an excellent post by Steve Peschka on the subject.

SSL certificate

To sign the app's token, we need an SSL certificate. While developing, we can use a self-signed development certificate. Later, in production, we will use a real one.

In addition, for our app to communicate with SharePoint securely, the traffic has to be encrypted with HTTPS. For that, we will need another SSL certificate with the app's URL. This isn't required in development, where we can relax the restriction and use HTTP, but in production it would be a serious imprudence.

To create a self-signed certificate, open the IIS console, go to the "Server Certificates" section and choose the "Create Self-Signed Certificate" option. We will name it CertificadoHighTrust.

At the end, we export the certificate including its private key, using "password" as the password. We are left with a PFX file containing the digital certificate that our app will use. This file has to sit in a folder accessible from Visual Studio. In our case, since we are developing on the SharePoint machine itself, we don't have to move the file anywhere; it will live at C:\Certificates\CertificadoHighTrust.pfx.

We will also export the certificate without the private key, to obtain the CertificadoHighTrust.cer file. To do so, go to "Server Certificates" in IIS, open the certificate and, on the "Details" tab, use the "Copy to File" option, indicating that we do not want to export the private key.

Now let's verify the permissions SharePoint needs in order to process our certificates. There are two requirements:

  • The SecurityTokenServiceApplicationPool application pool needs read permission on the certificates folder
  • The application pool of the web application where we will install the app (in our case, the one on port 80) needs read permission on the certificates folder

In our case, those are the SPFarm and SQLSvc accounts. We grant them the corresponding permissions on the Certificates folder.

Now we have to make SharePoint recognize our certificate. Open a SharePoint PowerShell console and register the certificate as trusted.
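In essence, the registration is a couple of lines of PowerShell (a minimal sketch, assuming the .cer file we exported earlier lives in C:\Certificates):

    # Load the public part of the certificate from the export location
    $publicCertPath = "C:\Certificates\CertificadoHighTrust.cer"
    $certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($publicCertPath)

    # Make SharePoint trust this certificate (and the tokens it will sign)
    New-SPTrustedRootAuthority -Name "CertificadoHighTrust" -Certificate $certificate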

Configuring the trusted issuer

Once SharePoint trusts our certificate, we can configure what is known as a "trusted issuer". This is nothing more than telling SharePoint that tokens signed by a "trusted issuer" can be trusted. And how does SharePoint know that an issuer is trustworthy? First, the issuer ID (a GUID that travels inside the token) has to exist in the SharePoint configuration. Second, the token has to be signed by a certificate that SharePoint "trusts" because it holds its public part. Since we have already taken care of the certificate, all that's left is telling SharePoint the ID of our trusted provider. It can be any GUID, and here we will use aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee (if the GUID contains letters, they have to be lowercase). Nice and easy to remember, right?

To register our trusted issuer, run the following PowerShell code right after the certificate import script:
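A sketch of that script, following the usual MSDN walkthrough (the site URL is a placeholder for any site collection in the farm, and $certificate comes from the import step above):

    # Placeholder URL: any site collection works as the service context
    $spsite = Get-SPSite "http://sp2013"
    $realm = Get-SPAuthenticationRealm -ServiceContext $spsite

    # Our chosen issuer ID; letters must be lowercase
    $specificIssuerId = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
    $fullIssuerIdentifier = $specificIssuerId + '@' + $realm

    # Register the issuer; -IsTrustBroker lets more than one app use it
    New-SPTrustedSecurityTokenIssuer -Name "High Trust App Issuer" `
        -Certificate $certificate `
        -RegisteredIssuerName $fullIssuerIdentifier `
        -IsTrustBroker

    iisreset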


 

We could now proceed to develop the app, but first let's allow the use of the self-signed certificate by relaxing the authentication requirements (careful: this may only be done in development environments, never in production).
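The usual development-only snippet looks like this:

    $serviceConfig = Get-SPSecurityTokenServiceConfig
    # Accept self-signed certificates and OAuth over plain HTTP (development only!)
    $serviceConfig.AllowOAuthOverHttp = $true
    $serviceConfig.Update()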


 

Developing the app

The app will need the SSL certificate and the password of its private part. In addition, the account the app runs under (the IIS application pool) has to have permission to access the certificate's location.

Open Visual Studio 2013 and create a SharePoint 2013 app. When the wizard appears, indicate that we want a provider-hosted app and that the app's identity will be established through a certificate.

We now have an application (in my example, created with Web Forms) that displays the name of the current SharePoint site where the app is installed. The solution consists of two projects: the SharePoint app project and the web project that holds the app's logic.

The code that makes the call to SharePoint is very simple:
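It is essentially what the Visual Studio template generates in the Default.aspx code-behind (reproduced here as a sketch, so details may differ between template versions):

    protected void Page_Load(object sender, EventArgs e)
    {
        // SharePoint passes SPHostUrl on the app start page URL
        Uri hostWeb = new Uri(Request.QueryString["SPHostUrl"]);

        // Build a high-trust (S2S) client context for the current Windows user
        using (var clientContext = TokenHelper.GetS2SClientContextWithWindowsIdentity(
            hostWeb, Request.LogonUserIdentity))
        {
            clientContext.Load(clientContext.Web, web => web.Title);
            clientContext.ExecuteQuery();
            Response.Write(clientContext.Web.Title);
        }
    }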


 

 

As you can see, the SharePoint context is established using the TokenHelper helper class and its GetS2SClientContextWithWindowsIdentity method. This call obtains a high-trust app (S2S, server-to-server) context using the identity of the Windows user running the application. This is the default configuration, but it can be changed to use a federated identity, for example.

When we run the application, the dialog for granting permissions to the app appears and, once we accept it, we can see the title of the SharePoint site, "Home".

Inside the TokenHelper

Let's see how our application builds the token. If we look at the GetS2SClientContextWithWindowsIdentity method, we will see that its body is just 4 lines of code.
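Paraphrased from the TokenHelper.cs file that Visual Studio generates (a sketch; the exact code varies between template versions):

    public static ClientContext GetS2SClientContextWithWindowsIdentity(
        Uri targetApplicationUri, WindowsIdentity identity)
    {
        // 1. The realm of the target SharePoint deployment
        string realm = string.IsNullOrEmpty(Realm)
            ? GetRealmFromTargetUrl(targetApplicationUri) : Realm;

        // 2. JWT claims describing the current Windows user
        JsonWebTokenClaim[] claims = (identity != null)
            ? GetClaimsWithWindowsIdentity(identity) : null;

        // 3. A signed S2S access token wrapping those claims
        string accessToken = GetS2SAccessTokenWithClaims(
            targetApplicationUri.Authority, realm, claims);

        // 4. A client context that sends that token on every request
        return GetClientContextWithAccessToken(
            targetApplicationUri.ToString(), accessToken);
    }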

First, the realm (domain) of the application is obtained. Right after that, a JWT (JSON Web Token) containing the claims of the current Windows user is built. Once we have the JWT, it is packaged into an access token with the GetS2SAccessTokenWithClaims method. Finally, the access token is exchanged for a SharePoint client context.
The interesting part is seeing how the token is made. If we look at the GetS2SAccessTokenWithClaims method, we will see that it ends up in an IssueToken method, which is what actually builds the access token.
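An abridged sketch of IssueToken (trimmed from the generated TokenHelper.cs for legibility; some parameters and branches are omitted):

    private static string IssueToken(
        string sourceApplication, string issuerApplication, string sourceRealm,
        string targetApplication, string targetRealm, string targetApplicationHostName,
        bool trustedForDelegation, IEnumerable<JsonWebTokenClaim> claims)
    {
        // Inner "actor" token: identifies the app and is signed with our certificate
        string issuer = string.Format("{0}@{1}", issuerApplication, sourceRealm);
        string nameid = string.Format("{0}@{1}", sourceApplication, sourceRealm);
        string audience = string.Format("{0}/{1}@{2}",
            targetApplication, targetApplicationHostName, targetRealm);

        var actorClaims = new List<JsonWebTokenClaim>
        {
            new JsonWebTokenClaim(JsonWebTokenConstants.ReservedClaims.NameIdentifier, nameid)
        };
        if (trustedForDelegation)
        {
            actorClaims.Add(new JsonWebTokenClaim(TrustedForImpersonationClaimType, "true"));
        }

        var actorToken = new JsonWebSecurityToken(
            issuer: issuer,
            audience: audience,                     // "aud": who the token is issued for
            validFrom: DateTime.UtcNow,
            validTo: DateTime.UtcNow.Add(HighTrustAccessTokenLifetime),
            signingCredentials: SigningCredentials, // backed by our SSL certificate
            claims: actorClaims);

        string actorTokenString = new JsonWebSecurityTokenHandler().WriteTokenAsString(actorToken);

        // Outer token: carries the user claims plus the actor token; it is NOT signed
        var outerClaims = new List<JsonWebTokenClaim>(claims)
        {
            new JsonWebTokenClaim(ActorTokenClaimType, actorTokenString)
        };

        var outerToken = new JsonWebSecurityToken(
            nameid, // the outer token's issuer matches the actor token's nameid
            audience,
            DateTime.UtcNow,
            DateTime.UtcNow.Add(HighTrustAccessTokenLifetime),
            outerClaims);

        return new JsonWebSecurityTokenHandler().WriteTokenAsString(outerToken);
    }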

 

There is an MSDN article that explains the parts of the access token our application builds. In essence, a signed "actor" token identifying the application is created and wrapped inside an outer token that carries the claims of the current user (created earlier) and is not digitally signed. The token is issued for the current application (the "aud" parameter) and signed with our certificate (the SigningCredentials property).
As you can see, the mysterious "access token" is nothing more than a text string with JSON-formatted data describing an application identity and a user identity.
The last "mysterious" part is how SharePoint turns a token into a ClientContext object. It's very simple: the access token travels in a header of the request to the API, and when the call returns, the context is correctly initialized. Let's look at it with Fiddler.

Testing with Fiddler

If we open Fiddler to watch the HTTP traffic between the application and SharePoint, we will see that the application calls the CSOM API (/_vti_bin/client.svc/ProcessQuery). If we inspect the request, we will see among its headers an Authorization header with the value "Bearer" followed by Base64-encoded text. This is our access token.

If we use a tool such as JWT.io to decode the token, we can see its structure.

For more information about the token's structure, there is a magnificent post by Kirk Evans on the subject.

Conclusion

I hope this post has demystified the world of High-Trust applications a little. As you can see, they let us use the app model without having to be in the cloud, an important step towards adapting our developments to the hybrid scenarios that look set to become much more common in the future.

Have you worked with this model? Can you share your experiences? The comments are waiting for you right here, below this post!

Exposing BLOB Data in Child Entities With Business Connectivity Services

Another interesting issue arose last week. I was tasked with implementing a BCS .NET connector to an OTRS issue tracking web service, as I mentioned earlier. The icing on the cake was extracting the binary data (issue attachments and images) and showing it in SharePoint by leveraging Business Connectivity Services (BCS).

I found a post on how to use BCS to expose SQL Server data, which was not applicable in my case. I also had the extra difficulty of having the attachments in a child entity: the OTRS Ticket was the primary entity in my model, with the OTRS Article as a child entity (a 1:N relation between them). The attachments were properties of the Article in OTRS, but in the model I attached them to the Ticket to make them more accessible.

So, I struggled to build a model that had to meet two goals:

  • expose entity fields with BLOB data (attachments and their binary content)
  • keep the BLOB entities on the child side of the relation

In this post I will show you how to achieve these goals step by step with Visual Studio.

What do we want to achieve?

This is the model I'd like to end up with. It has a main entity called Product and a set of child entities called Photo. The child entity has a compound key made of the ProductId and its own PhotoId. It also has a binary field called PhotoData, together with MIMEType and FileName fields that govern how the photo is exposed to the browser.

Those are the three minimum components for handling binary data in BCS: the MIME type, the binary content and a file name.

We will model these two entities in a custom .NET assembly connector. For brevity, I will fake the external service and return hardcoded data read from picture files embedded inside the connector.

Building the Connector

The first step is to create a new Business Data Connectivity Model named ProductModel in Visual Studio.


Visual Studio will create a new entity called Entity1 and implement its sample methods (ReadItem and ReadList), together with a .NET implementation of the entity. The main problem with BCS development is that the metadata has to match the implementation hand-in-glove, and the BCS metadata itself has to be internally consistent.

We will begin by filling in Entity1 and renaming it to Product. We will also change the name of the model from BdcModel1 to ProductModel. After renaming several files and nodes in the BDC Explorer, we will have something like this.

As you can see, the Product entity has a ReadItem method (gets a product by its ID) and a ReadList method (gets the list of products). The methods are declared in the BDC model (left side) and their code will reside in the ProductService.cs class.

We will model the Product entity operations first, as every change in the model triggers a change in the code generated in the ProductService class. To begin with, we will change the Identifier1 field of the Product entity into a ProductId of type Int32.

Modeling Mayhem

Then, in the BDCM editor, we will select the methods and fill in their details in the BDC Method Details pane. This is the tricky part of modeling BDC: it's very easy to get it wrong. Luckily, the BDC Explorer lets us copy and paste metadata to save time. First, we will model the ReadList operation. It will take no parameters and will return a "Return" parameter of type "Collection of Product", which will be our entity metadata. Take a look.

When we edit the metadata, we have the following BDC Explorer tree:


Here we have to change the type and the name of Entity1List, Entity1, Identifier1 and Message. They should become ProductList, Product, ProductId (Int32) and ProductName (String). The change is made in the Properties window (F4), where we set the Name and Type Name properties. When changing the Type Name, you have to pick the entities from the "Current Project" tab. For collections (such as ProductList) you should select the entity and tick the "Is Enumerable" checkbox.

Note: when modeling the ProductId, you also have to specify that the property maps to the identifier (and the entity it refers to, i.e. Product).

Now we have to do the same (without the collection, of course) for the ReadItem method. It should take one parameter (mapped to the ProductId identifier) and return a Product. The good news is that we can copy and paste the Product node from the ReadList method into the ReadItem method.

The underlying Product.cs class and the ProductService.cs service have to be changed to include the "hardcoded" data:
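Something along these lines (a minimal sketch; a real connector would query the external system instead of returning literals):

    using System.Collections.Generic;
    using System.Linq;

    public class Product
    {
        public int ProductId { get; set; }
        public string ProductName { get; set; }
    }

    public class ProductService
    {
        private static readonly List<Product> products = new List<Product>
        {
            new Product { ProductId = 1, ProductName = "Sailboat" },
            new Product { ProductId = 2, ProductName = "Canoe" }
        };

        // SpecificFinder: returns a single product by its identifier
        public static Product ReadItem(int productId)
        {
            return products.Single(p => p.ProductId == productId);
        }

        // Finder: returns all products
        public static IEnumerable<Product> ReadList()
        {
            return products;
        }
    }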

At this moment we have a workable connector that exposes products and product details, but nothing else. We can do a quick check by deploying the connector and creating a new external list in SharePoint.

Well done! Now we have to model the photos 🙂

Adding the Photo Entity to the model

First, we’ll add a new class to the project, with the following simple properties:
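A sketch of the class, matching the fields described above:

    public class Photo
    {
        public int ProductId { get; set; }    // identifier of the parent Product
        public int PhotoId { get; set; }      // identifier of the photo itself
        public string FileName { get; set; }
        public string MIMEType { get; set; }
        public byte[] PhotoData { get; set; } // the BLOB content
    }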

Then we add an entity in the BDCM model canvas by right-clicking it and choosing "Add / Entity".

Of course, we have to change its name and add the entity's properties. We have to add both identifiers, PhotoId and ProductId. Both of them have to refer to the Photo entity, and in the association we will let BDC know that it provides the value of ProductId when the association is navigated.

I have also added a ReadItem method.


Even though the association is what we are really after, we still have to model the ReadItem method in advance and add an instance of that method, of the SpecificFinder type. It takes 2 In parameters, the 2 identifiers of the Photo entity, and returns an instance of the Photo class, with all its fields.

We will now add the association between the Product and Photo entities, right-clicking again on the BDCM canvas.

In the dialog, we make sure that the association is correct; in this case we only want the navigation from Product to Photo, not the other way around. We remove the extra navigation method (the last one) and uncheck the Foreign Key association, as the ProductIds are returned by the code of the association method in the Product service class.

Now we have a new method called ProductToPhoto in the Product entity that returns the list of photos for a given product.

We still have to do the "boring" stuff of mapping the return types in the BDC Explorer pane:


After that, we have to write the code for the ProductToPhoto method. We won't be showing the photo itself yet, so we can set the BLOB array to null.
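A sketch of that method, added to the Product service class alongside the other hardcoded operations:

    // Association navigator: returns the photos that belong to one product.
    // PhotoData stays null here; the BLOB will be served by a StreamAccessor later.
    public static IEnumerable<Photo> ProductToPhoto(int productId)
    {
        return new List<Photo>
        {
            new Photo
            {
                ProductId = productId,
                PhotoId = 1,
                FileName = "sailboat.png",
                MIMEType = "image/png",
                PhotoData = null
            }
        };
    }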

Ready to roll! Deploy the solution to SharePoint and create the External Content Type profile pages in the BDC Service Application (Central Administration). It will automatically add the related Photos to the Product on its profile page.

We have to delete and recreate the external list. Now we can go to the View Profile action and see the details of the product and its photos.

Reading the photos

The only thing missing is the link to see the actual photo (the BLOB content). We have to add a StreamAccessor method and a method instance.

We can't add this method in the entity designer. We have to open the BDCM file as an XML file and add the Method and MethodInstance nodes to it by hand.


We will add our method under the existing ReadItem method:


The XML snippet to insert is this one:
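A sketch of that snippet (treat it as a template rather than copy-paste material: the TypeDescriptor names, namespaces and identifiers must match your own model exactly):

    <Method Name="ReadPhotoData">
      <Parameters>
        <Parameter Name="productId" Direction="In">
          <TypeDescriptor Name="ProductId" TypeName="System.Int32"
                          IdentifierEntityName="Photo"
                          IdentifierEntityNamespace="ProductModel"
                          IdentifierName="ProductId" />
        </Parameter>
        <Parameter Name="photoId" Direction="In">
          <TypeDescriptor Name="PhotoId" TypeName="System.Int32"
                          IdentifierEntityName="Photo"
                          IdentifierEntityNamespace="ProductModel"
                          IdentifierName="PhotoId" />
        </Parameter>
        <Parameter Name="photoData" Direction="Return">
          <TypeDescriptor Name="PhotoData" TypeName="System.IO.Stream" />
        </Parameter>
      </Parameters>
      <MethodInstances>
        <MethodInstance Name="ReadPhotoData" Type="StreamAccessor"
                        ReturnParameterName="photoData"
                        ReturnTypeDescriptorPath="PhotoData">
          <Properties>
            <Property Name="MimeTypeField" Type="System.String">MIMEType</Property>
            <Property Name="FileNameField" Type="System.String">FileName</Property>
          </Properties>
        </MethodInstance>
      </MethodInstances>
    </Method>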

As you can see, we return a Stream with the data. Two additional instance properties specify which entity property holds the MIME type and which one holds the file name.

Check the mappings: both identifiers should be mapped to the Photo entity, both as parameters and as return values (for the ReadItem method). If not, you will get a runtime complaint about "Expected 2 identifiers and found only 1". It took me some time to solve that one!

In our PhotoService.cs class we have to add the method that returns a Stream with the data. In my case I use a Base64 string containing a small sailboat image in PNG format, built with the excellent web site http://www.base64-image.de/step-1.php, which encodes an image into a string. I then use the .NET Convert class to turn that string back into the original array of bytes. (In this snippet I have shortened the string for legibility.)
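A sketch of the method (the Base64 payload is truncated here, as in the original snippet; substitute the full encoded string):

    using System;
    using System.IO;

    public class PhotoService
    {
        // StreamAccessor: returns the BLOB content of one photo as a Stream
        public static Stream ReadPhotoData(int productId, int photoId)
        {
            // Truncated Base64 PNG payload; the real string comes from the encoder
            string base64Photo = "iVBORw0KGgoAAAANSUhEUg...";
            byte[] photoBytes = Convert.FromBase64String(base64Photo);
            return new MemoryStream(photoBytes);
        }
    }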
 

Deploy again to SharePoint, rebuild the external content type profile pages and it’s done!
 
 
The complete code for this example is available on my SkyDrive.

Building a Document Routing Hierarchy with SharePoint Subsites and Content Organizer

On this occasion I have been exploring the possibility of a self-organizing document hierarchy in SharePoint 2010, built with the Content Organizer functionality. As you may recall, the Content Organizer allows documents to be routed according to rules that use metadata to decide where each document should go. This greatly enhances the usability of a document repository in SharePoint, as end users don't have to know exactly where a document should be uploaded. By leveraging content organizer rules, we automate the precise organization of the documents and significantly lower the chance of incorrect classification.

Content Organizer out-of-the-box

The out-of-the-box Content Organizer works great when you deal with different document libraries on a single site. You get one "Drop Off Library" to upload documents to. Once uploaded there, the documents are routed to the right document library and, optionally, into a specific folder.

The user interface for dropping a document into the Content Organizer notifies you that the document will be routed.

As you can see in the Content Organizer rule editor, we only get the local site's lists and libraries as destination options.

What happens when you have a hierarchy that spans multiple subsites, all sharing the same base content type but strictly separated into different subsites for security reasons? Well, in that case you have to tweak the Content Organizer a bit to accommodate the subsites.

Routing Documents to a different site

In order to allow a content organizer to route a document to a different site, you have to create a "Send To" connection in Central Administration. Go to "General Application Settings", then choose "Configure send to connections" in the "External Service Connections" section. On this page you will have to add the absolute path of the content organizer service of the site you wish to route the document to. The URL is always the same:

  •  Site URL followed by /_vti_bin/officialfile.asmx

In this example, there is a subsite called "Global" and a "send to" connection called Global is created. Please remember that the Send To connections configuration is stored per web application, so make sure that you are changing it for the right web application.

Once you have the Send To connection registered in Central Administration, you have to change two things in the site that you wish to be the entry point to the system. Go to "Site Settings", "Content Organizer Settings" and make sure that the "Sending to Another Site" checkbox is checked.

Now you can go to "Site Settings", "Content Organizer Rules" on that site and create a rule that targets another site.

There is one limitation to this approach: you can target a different site, but you can't target a specific document library on that site. The document will be routed to the Content Organizer on that site, and that site's rules will be enforced. So, in order to overcome this limitation, you have to add a rule on the destination site that routes the document into a specific document library.

As this is getting a bit tricky to explain in words, here is an outline of how my system works.

I add routing rules to the Root Site that send a newly uploaded document to the correct site, according to a Type column (in my case it's a Managed Metadata column, but it could be any type of column that can be compared against). When the document arrives at the Content Organizer on the destination site, two simple rules await it:

  • If the Type of the document is the correct one, I move the document to the corresponding document library
  • If the Type of the document is not the correct one, I route the document back to the Root Site

The purpose of this loop is to minimize the number of rules needed for correct classification. If a user uploads a Sales document to an HR site, I would otherwise have to write a rule there that moves it to the Sales site. By keeping all the routing logic for the different sites at the Root Site level, I just have to send the document back to the root site for it to be classified correctly.

Note: this setup can cause an infinite loop if you mess up the rules and conditions, so please double-check them.

PowerShell to the Rescue

So, we have seen how to organize a multi-site hierarchy with the Content Organizer feature. I admit that the only boring part of the whole process is building the "Send To" connections by hand, so I have created a tiny PowerShell script that does it for you. It parses the given web application URL, iterates over all the webs in its site collections and adds those with an active Content Organizer to the "Send To" connections. See the sketch below.

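The script goes along these lines (a sketch: verify the Content Organizer feature GUID on your farm and test outside production first):

    param([string]$WebApplicationUrl)

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $webApp = Get-SPWebApplication $WebApplicationUrl
    # GUID of the Content Organizer (DocumentRouting) web-scoped feature (assumed standard value)
    $routingFeatureId = [Guid]"7ad5272a-2694-4349-953e-ea5ef290e97c"

    foreach ($site in $webApp.Sites)
    {
        foreach ($web in $site.AllWebs)
        {
            if ($web.Features[$routingFeatureId] -ne $null)
            {
                # Register this web's content organizer as a Send To connection
                $connection = New-Object Microsoft.SharePoint.SPOfficialFileHost
                $connection.OfficialFileName = $web.Title
                $connection.OfficialFileUrl = New-Object System.Uri("$($web.Url)/_vti_bin/officialfile.asmx")
                $connection.ShowOnSendToMenu = $true
                $webApp.OfficialFileHosts.Add($connection)
            }
            $web.Dispose()
        }
        $site.Dispose()
    }

    $webApp.Update()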

STP2WSP File Converter, Part 1: The Anatomy of A .STP

Recently, together with my colleague Martin Schmidt, I gave a session on MOSS 2007 to SharePoint 2010 migration. Among other things I mentioned that the site template .STP files are no longer supported in SharePoint 2010. STP files are still supported for list templates, though.

From my professional experience I know that there are many people who have a lot of site templates in .STP format and don't want to lose their work. Microsoft suggests that the right way is to create a site from each template, upgrade to SP2010 and then save it as a WSP template in the new version. That's overkill, if you ask me.

My goal is to create a converter, written in .NET, that would crack open an STP site template and write a shiny new WSP file with the same structure. You can follow my journey over the coming days.

An .STP Site Template

Let's create a normal site from the blank site template. I will add a document library and a web part that exposes the library on the default site page.

Now, save it as an STP file by going to the Site Settings / Save Site as Template option.

The template is now safely stored in the Site Template Gallery at the root of the site collection.

Prying the Lid Off

Let's download the STP file and extract its contents with the Microsoft Cabinet SDK tools.
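An .STP file is just a CAB archive in disguise, so CABARC.EXE from the Cabinet SDK can unpack it (the file name here is an assumption):

    rem X = extract all files from the archive into the current folder
    cabarc X BlankWithLibrary.stp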

In this case there is only a single manifest.xml file inside. A quick inspection reveals that it holds the site template header and the site definition metadata.

Opening SharePoint Manager 2007 to inspect the source web's raw properties, we see that there is a clear mapping between the <MetaKey> tags and the site property bag.

Furthermore, the manifest.xml file keeps a reference to the original site definition (in this case, "Blank Site") from which the original site was created.

Comparing the two files side by side:

  • STP manifest.xml: note the TemplateID and Configuration attributes.
  • 12\TEMPLATE\1033\XML\WEBTEMP.XML: note the ID attribute of the Template tag and the ID attribute of its Configuration.

I'll keep investigating. The rest of the story comes soon, in Part 2.

[Road to SharePoint 2010] Getting Ready for 64-bit World

Welcome to the second installment of the Road to SharePoint 2010 series. This time I will tackle the oldest piece of news about this SharePoint version: it will be released only as a 64-bit installation. So, how do we prepare for this requirement? Well, by getting ready with our shiny new 64-bit servers and virtual machines.

(check the original scan from a 1983 magazine here)

The hardware and software requirements were outlined on the official SharePoint team blog.

What hardware will it need?

64-bit hardware has been recommended for SharePoint 2007 for some time now. Testing data shows that SharePoint benefits hugely from the improved memory management of a 64-bit environment. SQL Server is also a memory-intensive application, so it will be grateful for the extended memory space. Roughly speaking, the database tier benefits the most, followed by the front-end web servers, with the application servers benefiting the least.

What Windows will it need?

SharePoint 2010 will run on Windows Server 2008 / 2008 R2 x64.

What SQL Server will it need?

SharePoint will need SQL Server 2005 or 2008 / 2008 R2 x64.

What about the developer virtual machines?

It is strongly recommended to have a 64-bit host OS for 64-bit virtual machines, although you can still run a 64-bit VM on a 32-bit host OS. If you are using a 32-bit host OS, you must have a 64-bit capable processor (check here) and activate the 64-bit virtualization extensions in the BIOS (called Intel VT or AMD-V, depending on the CPU maker).

You can use Sun VirtualBox or VMware to create the virtual machine. Unfortunately, Microsoft Virtual PC cannot run a 64-bit guest OS in a VM.

Do I have to worry about my custom .NET code in SharePoint? How can I port it to 64 bits?

You will have to recompile your source code with a 64-bit configuration (or the AnyCPU build configuration). The 64-bit .NET runtime cannot load a 32-bit-only assembly.

If you don't have the source code, as with IFilters or third-party extensions, you will have to obtain the 64-bit version, if available. If not, you will have to remove that component from the farm.

As for the .NET Framework in 64 bits, there are useful references on MSDN to help us migrate our existing code. In short, if you stick to managed code, you shouldn't have problems. The problems arise when you use platform-dependent (a.k.a. P/Invoke) operations that cross the managed code boundary and access Windows directly. There is a known issue with int-typed handles and pointers coming from the COM or P/Invoke layer; you should use the IntPtr data type instead.
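A classic illustration of the pitfall, using the real Win32 SetForegroundWindow function (the commented-out declaration is the broken 32-bit-only variant):

    using System;
    using System.Runtime.InteropServices;

    static class NativeMethods
    {
        // WRONG on x64: an HWND is a pointer-sized value, so declaring it as a
        // 32-bit int truncates the handle on 64-bit Windows:
        //   [DllImport("user32.dll")]
        //   static extern bool SetForegroundWindow(int hWnd);

        // RIGHT on both x86 and x64: IntPtr always matches the pointer size
        [DllImport("user32.dll")]
        public static extern bool SetForegroundWindow(IntPtr hWnd);
    }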

If you want the gory details of the 32-to-64-bit issues, Scott Hanselman has a nice blog entry about them.

How do I migrate my SharePoint farm from 32 to 64 bits?

The best way is to do it gradually. First you move the database tier, then the application servers and finally the web servers. Do not mix 32-bit and 64-bit servers in the same tier.

The recommended steps are outlined in this TechNet article:  http://technet.microsoft.com/en-us/library/dd622865.aspx

.NET Framework 4.0 and Visual Studio 2010 Training Kit

I'm still digesting the changes in .NET 3.5 and SP1, and the guys from Redmond have already put together a training kit for the next version of the framework. It's impossible to keep up with this pace of change 🙁

The Visual Studio 2010 and .NET Framework 4.0 Training Kit includes presentations, hands-on labs and demos. The content is designed to help you learn how to use the Visual Studio 2010 features and a variety of framework technologies, including C# 4.0, Visual Basic 10, F#, the Parallel Computing Platform, WCF, WF, WPF, ASP.NET AJAX 4.0, ASP.NET MVC and Dynamic Data.

Get it from http://www.microsoft.com/downloads/details.aspx?FamilyID=752cb725-969b-4732-a383-ed5740f02e93&displaylang=en

How To Expose Files in a Shared Folder for Remote Access with SharePoint

I just completed a proof-of-concept setup that allows users to remotely access files sitting on a network share, using SharePoint and IIS 6.

I have a shared folder at \\SHARE2007\Shared and I wanted it to be accessible from http://share2007 (the SharePoint portal site).

The outline of the setup is as follows:

  • Add a new File Share content source to the SharePoint (MOSS) search settings, pointed at \\SHARE2007\Shared

  • Run a full crawl and check that it indexes the files correctly by searching for a file. Notice that the URL of the file points directly at the file share.

  • Open the IIS console and add a new virtual directory within the SharePoint portal site, pointing it at \\SHARE2007\Shared. Give it any name that suits you (I chose "uncshares"). I use a fixed identity to access the file share, because the SharePoint crawler already performs the security trimming for the current user, and this way I avoid delegating the user's identity to the file share.

  • Add a new Server Name Mapping in the search settings, replacing file://share2007/shared references with http://share2007/uncshares
  • Run a full crawl again
  • Search for a file from the SharePoint portal. You should notice that the displayed URL now points to the IIS virtual directory, not directly at the file share.

New SharePoint Training Content

If you are comfortable with .NET development but SharePoint has always been murky water for you, check out the new MSDN content tailored especially for wannabe SharePoint developers.

It contains whitepapers, screencasts, hands-on labs, virtual labs, demos and presentations to get you into the SharePoint programming world in no time.

http://mssharepointdeveloper.com
