The User Connection
Capture information about connected users with these 4 scripts
March 13, 2005
How many times have you found yourself in the following situation? You're consolidating, migrating, or retiring servers and need to relocate shares (or take them offline). You send out several Net Send notifications to users who connect to the servers, and you post announcements on the corporate intranet. Everything seems to be going smoothly until a few users—probably managers—complain that they didn't know you were going to be moving their data. Or you discover broken applications that were attached to the affected shares through domain service accounts and that were dependent on now-deleted data.
As companies become more and more dependent on computing infrastructures, projecting the impact of a planned server outage—or quantifying that impact following an unplanned outage—has become an important administrative chore. To help you simplify the task, I've written a four-script suite that you can use to capture connected-user metrics, user IDs, and information about the files that users or applications are accessing. These scripts give you a variety of options for creating reports that you then can analyze to get an accurate picture of your file servers' use and users. With the collected user metrics and accessed-file paths in hand, you can better inform users of server-environment changes and help management predict the impact of a downed server. Let's take a look at the scripts and how you can make them work for you.
First Things First
A primary goal of each of these scripts is to identify the network users and applications that are connecting to your target server (i.e., the server you're planning to migrate or retire). The most useful tool I know of for enumerating user connections is Sysinternals' PsLoggedOn tool (part of the PsTools suite). You can use PsLoggedOn to show local and remote connections to a local or remote machine. Three of our scripts use the tool with the syntax
psloggedon.exe \\computername
where computername is the name of a remote machine.
Because you're trying to determine network-based connections, you need to filter out local connections. When you're looking at hundreds or thousands of user connections, leaving local connections in your results probably won't significantly skew your statistics. However, suppressing these connections provides a clearer and more accurate picture of domain service accounts, which can point out application dependencies. Unfortunately, PsLoggedOn doesn't have a switch that provides such a filter. Therefore, the scripts use the following chain of Find commands: the first Find keeps only lines that contain a forward slash (i.e., a time/date stamp), and the second discards lines that contain the computer name (i.e., local accounts):
psloggedon.exe \\computername | find "/" | find /I /V "computername"
This approach eliminates connections that don't have a time/date stamp, as well as connections made by local-machine service accounts. If you want to see local service or user account connections, simply remove the second Find command:
psloggedon.exe \\computername | find "/"
Next, let's go over the scripts. Each one offers a slightly different perspective of your target server's connected users and accessed shares. You'll choose which script or scripts will give you the information you need.
The Four Scripts
The GetSessionMetrics.bat script, which Listing 1 shows, uses the basic PsLoggedOn syntax to capture only the number of connected users, without logging the user IDs. The Set command at callout A in Listing 1 increments a counter that totals the number of lines the PsLoggedOn command returns while ignoring the user IDs themselves. This user count provides a snapshot of the number of connected users at a given point in time.
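The full listing isn't reproduced here, but the counting technique boils down to something like the following sketch (the server name and log path are placeholders, and the downloadable listing differs in its details):

@echo off
setlocal
set server=FILESRV01
set count=0
rem Count each filtered PsLoggedOn line; the user IDs themselves are discarded.
for /f "delims=" %%L in ('psloggedon.exe \\%server% ^| find "/" ^| find /I /V "%server%"') do set /a count+=1
rem Putting the redirection first keeps cmd from misreading a trailing digit
rem as a file handle. The real script uses tab separators to build a .tsv file.
>> Logs\%server%.tsv echo %date% %time% %count%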
I suggest that you use Task Scheduler to capture this information at several points during the business day because a user community's connection patterns vary by time and date; an average of user connections will give you the most meaningful statistics. Try to incorporate known user-access times or dates in your scheduling. For example, if accounting data is involved and you know that the accounting department's work cycle centers on certain days each month, be sure to include those days. Your results won't necessarily reflect the use of a file server's shared folders, but they will alert you to mapped drives and open folders or files.
If you want to run a script several times during the day, first test each script for its average run time, then set your repeat interval with a generous margin beyond that average. Task Scheduler never starts a second instance of a scheduled task while a previous run is in progress, so if a script takes 1 hour to run and you schedule it every 30 minutes, the overlapping launch attempts simply fail until the first instance completes. This safety feature is good news, because two simultaneous instances of a script such as GetSessionMetrics.bat, which writes to one output file, could corrupt your results. But it also means that scheduling overlaps can silently skip intervals, leaving gaps in your data.
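If you'd rather script the scheduling than click through the Task Scheduler UI, the Schtasks command that ships with XP and Windows 2003 can create the task for you. The task name, path, and times below are placeholders:

schtasks /create /tn "GetSessionMetrics" /tr "C:\Scripts\GetSessionMetrics.bat" /sc hourly /mo 2 /st 08:00:00

On Win2K, which doesn't include Schtasks, the older At command provides similar functionality.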
The next script, GetUniqueSessionUserIDs.bat, determines not only the number of user connections but also who those users actually are. GetUniqueSessionUserIDs.bat, which Listing 2 shows, captures the user IDs that GetSessionMetrics.bat ignores and sends those IDs to a spreadsheet report. Again, I suggest you run this script several times throughout the day to get a good average of results. Although good for obtaining averages, running a script repeatedly can cause problems: Because each run's data is appended to the spreadsheet report, you'll soon have a long report that contains many duplicate names. You could let the report run for a specified period, then remove the duplicate user IDs by opening the report in Microsoft Excel and performing a manual sort followed by a filter operation. Instead, I've included code (see callout A in Listing 2) that will automatically handle the sorting and filtering each time the script runs. The result is a report that contains a list of unique user IDs in alphabetical order. Each subsequent script run will add only new IDs to the report.
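The exact code at callout A isn't reproduced here, but the sort-and-dedupe pass amounts to something like this sketch (the file names are placeholders, and the real listing differs in its details):

@echo off
setlocal enabledelayedexpansion
rem Sort the accumulated IDs alphabetically.
sort Logs\FILESRV01.tsv > Logs\FILESRV01.sorted
rem Rewrite the report, keeping only the first occurrence of each line.
set prev=
(for /f "delims=" %%L in (Logs\FILESRV01.sorted) do (
    if /I not "%%L"=="!prev!" echo %%L
    set "prev=%%L"
)) > Logs\FILESRV01.tsv
del Logs\FILESRV01.sorted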
The GetUniqueSessionsIDsWithNames.bat script, which Listing 3 shows, captures more detailed user information than just the user ID. Often, I'll knock myself out writing a complicated script to capture group memberships or other lists of user IDs, only to be told that management wants to see results that provide users' friendly names (e.g., fredsmith, sallylee). If your organization has a scheme in place to assign friendly names to domain user accounts, this type of request might be fairly easy to deal with. However, in many environments, user IDs are somewhat cryptic, human resources (HR)-assigned codes or other less-intuitive identifiers that don't reveal much about a user's true identity. If you're in this type of environment, the code at callout B in Listing 3 uses JoeWare's GetUserInfo utility to obtain the display name or full name and description information for the user IDs listed in the report that GetUniqueSessionUserIDs.bat produces. Having this detailed user information also makes user contact and notification much easier.
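The code at callout B isn't reproduced here, but the lookup loop works along these lines. I'm assuming GetUserInfo's basic getuserinfo userid invocation; check the utility's own usage output for the exact switches that trim its output to just the fields you want:

rem For each unique ID in the report, query the domain for that user's details.
for /f "delims=" %%U in (Logs\FILESRV01.tsv) do (
    getuserinfo %%U >> Logs\FILESRV01-names.txt
)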
The final user-connection script in our suite is GetOpenFiles.bat. Listing 4 shows an excerpt from this script. When you want to determine usage patterns or which users might be affected when you migrate or retire a server, knowing which paths users are accessing can be valuable. This information can also help you more accurately identify whether files within a share are being accessed or whether the share is indeed inactive. Trying to determine this by viewing the last-accessed dates in Windows can be a bit deceiving because just the act of selecting a file or folder can trigger a date change. Logging actual file access gives you a much more accurate picture of which shares are truly active.
To capture accessed folder or file paths and the user IDs of the accounts that accessed the files, GetOpenFiles.bat uses Sysinternals' PsFile tool (also a member of the PsTools suite) with some chained Find commands:
psfile.exe \\computername | find /V "Locks:" | find /V "Access:"
PsFile's output has one big drawback: It lists the file path and user output on separate lines. Therefore, you have to use separate For commands to capture the path and user information, then get that information back together before you can write it to the log file. The code at callout B in Listing 4 accomplishes this "variable shuffle."
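Listing 4's callout B isn't reproduced here, but the shuffle works roughly like this sketch (the server name and output file are placeholders). Each line that doesn't contain "User:" is held as the current path; when the matching user line arrives, the two are written out together:

@echo off
setlocal enabledelayedexpansion
set server=FILESRV01
set path_line=
for /f "tokens=*" %%L in ('psfile.exe \\%server% ^| find /V "Locks:" ^| find /V "Access:"') do (
    set "line=%%L"
    rem If removing "User:" changes nothing, this line is a file path; hold it.
    if "!line:User:=!"=="!line!" (
        set "path_line=!line!"
    ) else (
        rem Otherwise it's the user line; write the held path and the user
        rem together. The double quotes guard against reserved characters
        rem in the path (see the discussion below).
        >> Logs\%server%-files.tsv echo "!path_line!" !line!
    )
)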
If you're migrating selected shares on a server (rather than migrating the entire server), this script will help you better identify the users who access those locations. The output from this script will be fairly verbose; if your server has hundreds of users, there might be thousands of open files and folders. Therefore, the script doesn't perform user-information queries but simply outputs the results to a report.
One of the challenges of sorting and manipulating text strings in command-shell scripting is dealing with reserved characters. If your file-path strings contain any reserved characters, such as an ampersand (&) or an opening or closing parenthesis, the reserved character will be interpreted as part of your script code and will cause unusual errors. The best way to solve this problem is to use double quotes to encapsulate the suspect string; doing so will keep any reserved characters from being misinterpreted. The code at callout A in Listing 4 performs a sort-and-filter operation (similar to the operations that remove duplicates in the GetUniqueSessionUserIDs.bat and GetUniqueSessionsIDsWithNames.bat scripts) that handles these character problems and keeps double quotes around any file-path strings as they're sorted and output.
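Here's a minimal illustration of the problem, using a hypothetical file path:

rem Hypothetical path containing a reserved character (the ampersand):
set "filepath=\\FILESRV01\Sales & Reports.xls"
rem Unquoted, cmd would split the next line at the ampersand and try to run
rem "Reports.xls" as a command. The double quotes keep the string intact
rem through the echo and the subsequent sort:
echo "%filepath%" >> quoted-paths.txt
sort quoted-paths.txt > sorted-paths.txt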
How To Use the Scripts
I've tested these scripts on Windows Server 2003, Windows XP Professional Service Pack 1 (SP1), Windows 2000 Server SP4, and Win2K Pro. To begin using them, follow these steps:
1. Download PsLoggedOn and PsFile (http://www.sysinternals.com) and GetUserInfo (http://www.joeware.net/win/free/tools/getuserinfo.htm). Some of these utilities are sensitive to spaces in the file path, so place them in a folder that has no spaces in its path. You might consider placing them in a shared folder on a server for easier access.
2. Download the four user-connection scripts from http://www.windowsitpro.com/windowsscripting (enter 45505 in the site's InstantDoc ID box, then click the 45505.zip file link at the top of the online article). Copy all the scripts into one folder on the server or PC from which you're going to run your queries. (Be aware that column widths in the printed publication force us to wrap code lines in the printed listings, which might cause the code to run incorrectly. Also, some of these scripts incorporate key spaces, tabs, and other details that aren't obvious in the code unless you download it as described here.)
3. Edit the first line or two in each script (e.g., the code at callout A in Listing 3) to point to the PsTools and JoeWare utilities.
4. Create a servers.txt file in the same folder as the scripts, listing your target servers with one server name per line. The scripts are self-locating and will look for this file alongside themselves (a sketch of this plumbing appears after these steps).
5. Note that each script automatically creates a Logs folder the first time it runs and places all its logs in that folder. These logs are tab-separated value (.tsv) files, named with the target server name.
6. After you've tested the scripts and have decided which script or scripts give you the information that's most helpful for your environment, use Task Scheduler to run your choices regularly. (I run mine about every 2 hours during the business day.)
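Here's a sketch of the self-locating plumbing the scripts share; %~dp0 expands to the folder the running script lives in (the downloadable listings differ in their details):

@echo off
rem Work from the script's own folder so servers.txt and Logs are found
rem no matter where the script is launched from.
pushd "%~dp0"
if not exist Logs md Logs
rem Process each target server named in servers.txt, one per line.
for /f "delims=" %%S in (servers.txt) do (
    echo Querying %%S ...
    rem ...per-server PsLoggedOn or PsFile work goes here, writing to Logs\%%S.tsv...
)
popd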
The four scripts I've provided here should help make server and share migrations smoother and community-share use easier to assess. Try them out in a test environment and pick the scripts that will give you the information that you (and your organization's management) need.