Normally we use the WebRequest class to POST data to a server. In many cases the server runs some checks: are you logged in, does the request come from the same domain? Those are easy to fool by changing the request's properties. But what if the server implements CSRF protection?
If you are not familiar with CSRF, a quick search will tell you all about it; here is the short version. A typical CSRF defense puts a hidden field in the form page, for example `<input type="hidden" name="lt" value="random-token" />`. When the form is submitted, the server checks that the POSTed name/value collection contains this field, and if it does, validates its value.
Here is the problem: what do we put in the data we POST? We can inspect the HTML to learn the field's name and value, but the value usually changes on every page refresh. So how do we get hold of it at the moment we POST?
The common WebRequest recipes found online won't work. They create a request, get a Stream from it, and write the POST body into that Stream, but at that point we don't know the CSRF value yet, so the POST is bound to fail. In theory we should GET the page first, parse the returned HTML ourselves to extract the CSRF value, and then POST. But when we call WebRequest.Create again for the POST, that counts as a brand-new visit and the CSRF value has already changed. That road is a dead end.
Fortunately we still have WebClient. A WebClient lets us hold on to a single instance and reuse it, whereas a WebRequest can only be produced by a static factory method, so we cannot reuse one instance by just changing the URL. This is probably also part of why Microsoft later introduced the new HttpClient (in .NET 4.5), aiming to unify the HTTP client story.
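For comparison, the same keep-one-session idea maps directly onto HttpClient: a CookieContainer attached to the handler is shared by every request made through the client. This is only a minimal sketch; the URLs and the "lt" token value are placeholders, and parsing the token out of the page is left as a comment:

```csharp
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class HttpClientSketch
{
    static async Task Main()
    {
        // One CookieContainer shared by every request made through this client,
        // so the session cookie set during the GET is sent back with the POST.
        var cookies = new CookieContainer();
        var handler = new HttpClientHandler { CookieContainer = cookies };
        using (var client = new HttpClient(handler))
        {
            // Placeholder URL: replace with the real login page.
            string page = await client.GetStringAsync("https://example.com/login");
            // ...parse the CSRF token out of `page` here...
            var form = new FormUrlEncodedContent(new[]
            {
                new KeyValuePair<string, string>("username", "user"),
                new KeyValuePair<string, string>("password", "pass"),
                new KeyValuePair<string, string>("lt", "token-parsed-from-page"),
            });
            HttpResponseMessage resp = await client.PostAsync("https://example.com/login", form);
            Console.WriteLine(resp.StatusCode);
        }
    }
}
```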
So what we need to do now is subclass WebClient and override the relevant methods. The code is as follows:
public class CookieAwareWebClient : WebClient
{
    public string Method;
    public CookieContainer CookieContainer { get; set; }
    public Uri Uri { get; set; }

    public CookieAwareWebClient()
        : this(new CookieContainer())
    {
    }

    public CookieAwareWebClient(CookieContainer cookies)
    {
        this.CookieContainer = cookies;
        this.Encoding = Encoding.UTF8;
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        HttpWebRequest httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            // The shared CookieContainer is what keeps the session alive across
            // the GET (to fetch the token) and the POST (to log in).
            httpRequest.CookieContainer = this.CookieContainer;
            httpRequest.ServicePoint.Expect100Continue = false;
            httpRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.5 Safari/537.36";
            httpRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
            httpRequest.Headers.Add(HttpRequestHeader.AcceptLanguage, "zh-CN,zh;q=0.8,en;q=0.6,nl;q=0.4,zh-TW;q=0.2");
            httpRequest.Referer = "some url";
            httpRequest.KeepAlive = true;
            httpRequest.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
            if (Method == "POST")
            {
                httpRequest.ContentType = "application/x-www-form-urlencoded";
            }
        }
        return request;
    }

    protected override WebResponse GetWebResponse(WebRequest request)
    {
        WebResponse response = base.GetWebResponse(request);
        // The CookieContainer already records Set-Cookie headers automatically;
        // hook in here only if you need to inspect or adjust cookies manually.
        string setCookieHeader = response.Headers[HttpResponseHeader.SetCookie];
        if (setCookieHeader != null)
        {
            // e.g. parse setCookieHeader and add a Cookie to this.CookieContainer.
        }
        return response;
    }
}
As you can see, the key is really making good use of the CookieContainer class. Next comes usage: we first GET the login page once, pull the CSRF value out of the returned HTML (with a regex, string replacement, whatever works), and then POST it back to the server.
var cookieJar = new CookieContainer();
CookieAwareWebClient client = new CookieAwareWebClient(cookieJar);

// The website sets a cookie that is needed for login, and the 'lt' token
// is different on every request.
string response = client.DownloadString("url for get");
string regx = "<input type=\"hidden\" id=\"lt\" name=\"lt\" value=\"(?<PID>\\S+?)\" />";
// Parse out 'lt'; the cookie is handled automatically by the CookieContainer.
string token = Regex.Match(response, regx).Groups["PID"].Value;
string urlforlogin = "url for login";
string postData =
    string.Format("username={0}&password={1}&lt={2}", "user", "pass", token);
client.Method = "POST";
response = client.UploadString(urlforlogin, postData);

client.Method = "GET";
And that's it. From here on you just vary the URL and keep calling DownloadString (a crawler, in plain terms), and then do whatever data analysis your business needs on the results.
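As a sketch of that last step, assuming the `client` instance of CookieAwareWebClient above is already logged in, and using a placeholder URL pattern and page range, the session can be reused across many pages like this:

```csharp
// Reuse the already-logged-in client: the session cookie lives in its
// CookieContainer, so every subsequent GET is authenticated.
// The URL pattern and page count below are placeholders.
var pages = new List<string>();
for (int i = 1; i <= 10; i++)
{
    string url = string.Format("https://example.com/data?page={0}", i);
    string html = client.DownloadString(url);
    pages.Add(html);
    // Be polite to the server between requests.
    System.Threading.Thread.Sleep(500);
}
// `pages` now holds the raw HTML, ready for whatever parsing you need.
```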