WebClient Examples
Use WebClient to download files. Specify HTTP headers and handle strings and byte arrays.
C#
This page was last reviewed on Oct 13, 2023.
WebClient. Found in the System.Net namespace, this C# class downloads web pages and files. WebClient is powerful.
Class notes. WebClient is versatile. It makes it easy to download web pages for testing. We often use it in a using-statement.
using
HttpClient
First example. Make sure to include the System.Net namespace. This example creates a new WebClient instance and sets its user agent.
Then This WebClient will download a page and the server will think it is Internet Explorer 6. It gets a byte array of data.
byte Array
Tip You can add a new HTTP header to your WebClient download request by assigning an entry in the Headers collection.
Also You can use the WebHeaderCollection returned by Headers and call the Add, Remove and Set methods on it, or read its Count property (a sketch follows the output below).
Info Internally, the DownloadData method will allocate the bytes on the managed heap.
using System;
using System.Net;

// Create web client simulating IE6.
using (WebClient client = new WebClient())
{
    client.Headers["User-Agent"] =
        "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0)";
    // Download data.
    byte[] arr = client.DownloadData("http://www.example.com/");
    // Write values.
    Console.WriteLine(arr.Length);
}
1256
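WebHeaderCollection sketch. The Also note above mentions calling methods on the WebHeaderCollection directly. Here is a minimal sketch of that idea, assuming the same example.com URL and made-up header values.

using System;
using System.Net;

using (WebClient client = new WebClient())
{
    // Get the request header collection from the Headers property.
    WebHeaderCollection headers = client.Headers;
    // Add and set entries (header names and values here are just for illustration).
    headers.Add("Accept-Language", "en-US");
    headers.Set("User-Agent", "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0)");
    // Remove an entry and report how many headers remain.
    headers.Remove("Accept-Language");
    Console.WriteLine(headers.Count);
}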
Example 2. This example sets two HTTP request headers on the Headers collection of WebClient. It then reads from the ResponseHeaders collection.
Detail To set multiple request headers, assign the header values you want to their string keys on the Headers collection.
Detail This part of the example gets a response HTTP header using the client.ResponseHeaders collection.
Tip You can access this much like a hash table or dictionary. If there is no header set for that key, the result is null.
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();
        // Set user agent and also accept-encoding headers.
        client.Headers["User-Agent"] =
            "Googlebot/2.1 (+http://www.googlebot.com/bot.html)";
        client.Headers["Accept-Encoding"] = "gzip";
        // Download data.
        byte[] arr = client.DownloadData("http://www.dotnetperls.com/");
        // Get response header.
        string contentEncoding = client.ResponseHeaders["Content-Encoding"];
        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(arr.Length);
        Console.WriteLine(contentEncoding);
    }
}
--- WebClient result ---
2040
gzip
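Null check sketch. As the Tip above notes, a missing response header comes back as null. A minimal sketch of a guarded read, assuming the same dotnetperls.com URL and no Accept-Encoding request header.

using System;
using System.Net;

using (WebClient client = new WebClient())
{
    byte[] arr = client.DownloadData("http://www.dotnetperls.com/");
    // ResponseHeaders acts like a dictionary: a missing key yields null.
    string contentEncoding = client.ResponseHeaders["Content-Encoding"];
    if (contentEncoding == null)
    {
        Console.WriteLine("No Content-Encoding header was returned.");
    }
    else
    {
        Console.WriteLine(contentEncoding);
    }
}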
Example 3. Next, we download a web page from the Internet into a string. We create a WebClient and then specify the URL we want to download as the parameter to the DownloadString method.
Note If no Accept-Encoding header is specified, the server usually returns an uncompressed, plain text response.
Info Internally, the DownloadString method will call into lower-level system routines in the Windows network stack.
And It will allocate the resulting string on the managed heap. Then it will return a value referencing that data.
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();
        // Download string.
        string value = client.DownloadString("http://www.dotnetperls.com/");
        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(value.Length);
        Console.WriteLine(value);
    }
}
Console program. This console program receives the target URL you want to download, and the local file you want to append to. If the local file is not found, it will be created.
Info If your website exposes debugging information at a certain URL, you can configure this program to download that data and log it.
Also It is possible to run this program on a timer, or invoke it from other programs with the Process.Start method (see the sketch after the program below).
Process
Tip You can write a console program that accesses a specific URL and then stores it in a log file.
using System;
using System.IO;
using System.Net;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            Console.WriteLine("*** Log Append Tool ***");
            Console.WriteLine(" Specify file to download, log file");
            Console.WriteLine("Downloading: {0}", args[0]);
            Console.WriteLine("Appending: {0}", args[1]);
            // Download url.
            using (WebClient client = new WebClient())
            {
                string value = client.DownloadString(args[0]);
                // Append url.
                File.AppendAllText(args[1],
                    string.Format("--- {0} ---\n", DateTime.Now) + value);
            }
        }
        finally
        {
            Console.WriteLine("[Done]");
        }
    }
}
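Process.Start sketch. The Also note above suggests launching this tool from another program. A minimal sketch follows; the executable name LogAppendTool.exe, the debug URL and the log file name are hypothetical placeholders for this example.

using System.Diagnostics;

// Launch the log-append tool with a URL and a log file as arguments.
// "LogAppendTool.exe", the URL and "log.txt" are placeholder names.
Process process = Process.Start("LogAppendTool.exe",
    "http://www.example.com/debug log.txt");
// Wait for the tool to finish appending before continuing.
process.WaitForExit();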
Timing downloads. This program implements a console application that allows you to time a certain web page at any URL. It downloads the web page a certain number of times.
Note The program downloads the page _max times (capped at 100, and set to 5 here). It averages the total milliseconds elapsed per page and prints this to the screen.
using System;
using System.Diagnostics;
using System.Net;

class Program
{
    const int _max = 5;

    static void Main(string[] args)
    {
        try
        {
            // Get url.
            string url = args[0];
            // Report url.
            Console.ForegroundColor = ConsoleColor.White;
            Console.WriteLine("... PageTimeTest: times web pages");
            Console.ResetColor();
            Console.WriteLine("Testing: {0}", url);
            // Fetch page.
            using (WebClient client = new WebClient())
            {
                // Set gzip.
                client.Headers["Accept-Encoding"] = "gzip";
                // Download.
                // ... Do an initial run to prime the cache.
                byte[] data = client.DownloadData(url);
                // Start timing.
                Stopwatch stopwatch = Stopwatch.StartNew();
                // Iterate.
                for (int i = 0; i < Math.Min(100, _max); i++)
                {
                    data = client.DownloadData(url);
                }
                // Stop timing.
                stopwatch.Stop();
                // Report times.
                Console.WriteLine("Time required: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds);
                Console.WriteLine("Time per page: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds / _max);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            Console.WriteLine("[Done]");
        }
    }
}
Headers. You can set the request HTTP headers. Do this through the Headers property, which returns a WebHeaderCollection.
Response headers. You can access the response HTTP headers after you invoke DownloadData or DownloadString. Headers are found in ResponseHeaders.
Threads. It is possible to access web pages on separate threads. The WebClient class provides OpenReadAsync, DownloadDataAsync, DownloadFileAsync and DownloadStringAsync methods.
Note These let the current method continue running while the download completes. They return void and signal completion through events.
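Async sketch. A minimal sketch of the event-based pattern, assuming DownloadStringAsync together with the DownloadStringCompleted event; the URL and the Thread.Sleep delay are placeholders to keep the console program alive.

using System;
using System.Net;
using System.Threading;

WebClient client = new WebClient();
// The completed event fires when the download finishes; the page is in e.Result.
client.DownloadStringCompleted += (sender, e) =>
{
    if (e.Error == null)
    {
        Console.WriteLine("Downloaded {0} chars", e.Result.Length);
    }
};
// DownloadStringAsync returns void; the current method keeps running.
client.DownloadStringAsync(new Uri("http://www.example.com/"));
Console.WriteLine("Download started...");
// Keep the process alive long enough for the callback (sketch only).
Thread.Sleep(5000);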
Dispose. The WebClient class holds onto some system resources which are required to access the network stack in Microsoft Windows. These resources are eventually cleaned up.
However If you call Dispose manually or use a using-statement, these resources are released at a more predictable time.
Summary. We used the WebClient class in the System.Net namespace. This class allows us to download web pages into strings and byte arrays. It will fetch external resources.
© 2007-2024 Sam Allen.