Commit 275762c

Author: Yoichi Kawasaki
Merge pull request #5 from alexandreweiss/sampleNginx
added csv and type sample under nginx
2 parents 5ecf10e + 6e29be3, commit 275762c

3 files changed: +80 −3 lines changed
README.md

Lines changed: 57 additions & 3 deletions
@@ -69,7 +69,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
 fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default if **add_time_field** and **add_tag_field** are true respectively. Below are two types of the plugin configurations - Default and All options configuration.
 
 ### (1) Default Configuration (No options)
-<u>fluent.conf</u>
+<u>fluent_1.conf</u>
 ```
 <source>
     @type tail # input plugin
@@ -88,7 +88,7 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
 ```
 
 ### (2) Configuration with All Options
-<u>fluent.conf</u>
+<u>fluent_2.conf</u>
 ```
 <source>
     @type tail # input plugin
@@ -114,7 +114,7 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
 ### (3) Configuration with Typecast filter
 
 Add the typecast filter when you want to cast field types. The field types of code and size are cast by the typecast filter.
-<u>fluent.conf</u>
+<u>fluent_typecast.conf</u>
 ```
 <source>
     @type tail # input plugin
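What the typecast filter does to a record can be illustrated with a minimal Python equivalent. This is an illustrative sketch only, not the plugin's implementation; the `code` and `size` field names come from the README section above, and the `typecast` helper is ours:

```python
# Minimal Python illustration of what the typecast filter does:
# cast selected string fields of a parsed log record to typed values.
# NOT the plugin's implementation, just an equivalent sketch.

def typecast(record, casts):
    """Return a copy of record with the given fields cast (e.g. code/size to int)."""
    out = dict(record)
    for field, cast in casts.items():
        if field in out:
            out[field] = cast(out[field])
    return out

# An access-log record as fluentd's tail input would emit it (all strings)
raw = {"host": "127.0.0.1", "code": "200", "size": "777"}
typed = typecast(raw, {"code": int, "size": int})
# typed["code"] and typed["size"] are now integers
```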
@@ -146,6 +146,50 @@ Add the typecast filter when you want to cast field types.
 ```
 gem install fluent-plugin-filter_typecast
 ```
+### (4) Configuration with CSV format as input and specific field type as output
+If you want to send logs generated with a known delimiter (such as a comma or a semicolon) to Log Analytics, you can use the csv format of fluentd together with the keys/types properties.
+This can be used with any log; here it is implemented with an Nginx custom log.
+<u>fluent_csv.conf</u>
+
+Suppose your log is formatted as below in /etc/nginx/conf.d/log.conf:
+```
+log_format appcustomlog '"$time_iso8601";"$hostname";$bytes_sent;$request_time;$upstream_response_length;$upstream_response_time;$content_length;"$remote_addr";$status;"$host";"$request";"$http_user_agent"';
+```
+And this log is activated through /etc/nginx/conf.d/virtualhost.conf:
+```
+server {
+...
+access_log /var/log/nginx/access.log appcustomlog;
+...
+}
+```
+You can use the following configuration for the source to tail the log file and format it with the proper field types.
+```
+<source>
+    @type tail
+    path /var/log/nginx/access.log
+    pos_file /var/log/td-agent/access.log.pos
+    tag nginx.accesslog
+    format csv
+    delimiter ;
+    keys time,hostname,bytes_sent,request_time,upstream_response_length,upstream_response_time,content_length,remote_addr,status,host,request,http_user_agent
+    types time:time,hostname:string,bytes_sent:float,request_time:float,upstream_response_length:string,upstream_response_time:string,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+    time_key time
+    time_format %FT%T%z
+</source>
+
+<match nginx.accesslog>
+    @type azure-loganalytics
+    customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
+    shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw== # dummy key
+    log_type NginxAcessLog
+    time_generated_field time
+    time_format %FT%T%z
+    add_tag_field true
+    tag_field_name mytag
+</match>
+```
+
 
 ## Sample inputs and expected records
 
@@ -162,6 +206,16 @@ The output record for sample input can be seen at Log Analytics portal like this
 
 ![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image.png)
 
+<u>Sample Input (nginx custom access log)</u>
+```
+"2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;20882;0.178;-;"193.192.35.178";200;"mynginx.domain.com";"GET /mysite/picture.jpeg HTTP/1.1";"Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/63.0.3239.84 Safari/537.36"
+```
+
+<u>Output Record</u>
+
+Part of the output record for the sample input can be seen at the Log Analytics portal like this, with fields of type _s (string) or _d (double):
+
+![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image-2.png)
 
 ## Tests
 ### Running test code (using System rake)

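As a cross-check of the CSV section in the README diff above, the following Python sketch parses the sample access-log line the way fluentd's `format csv` with `delimiter ;` would, then applies casts analogous to the `types` property. The field names follow the nginx `log_format` shown above; the `parse_access_log_line` helper is illustrative only, not part of the plugin:

```python
import csv
from datetime import datetime

# Field names follow the nginx log_format above (illustrative helper, not
# plugin code). A CSV parser honors the surrounding double quotes, so the
# semicolons inside "$http_user_agent" do not split the field.
KEYS = ["time", "hostname", "bytes_sent", "request_time",
        "upstream_response_length", "upstream_response_time",
        "content_length", "remote_addr", "status", "host",
        "request", "http_user_agent"]
CASTS = {"bytes_sent": float, "request_time": float, "status": int}

def parse_access_log_line(line):
    fields = next(csv.reader([line], delimiter=";"))
    record = dict(zip(KEYS, fields))
    for key, cast in CASTS.items():
        record[key] = cast(record[key])
    # time_format %FT%T%z is shorthand for %Y-%m-%dT%H:%M:%S%z (ISO 8601)
    record["time"] = datetime.strptime(record["time"], "%Y-%m-%dT%H:%M:%S%z")
    return record

sample = ('"2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;20882;0.178;-;'
          '"193.192.35.178";200;"mynginx.domain.com";'
          '"GET /mysite/picture.jpeg HTTP/1.1";'
          '"Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/63.0.3239.84 Safari/537.36"')
record = parse_access_log_line(sample)
```

Note that `content_length` is kept as a string because nginx emits `-` when the value is absent, which is why the README's `types` line does not cast it to a number.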
examples/fluent_csv.conf

Lines changed: 23 additions & 0 deletions

@@ -0,0 +1,23 @@
+<source>
+    @type tail # input plugin
+    path /var/log/nginx/access.log # monitored file
+    pos_file /var/log/td-agent/access.log.pos # position file
+    format csv # format
+    tag nginx.accesslog # tag
+    delimiter ; # record delimiter used in the source log
+    keys time,hostname,bytes_sent,request_time,upstream_response_length,upstream_response_time,content_length,remote_addr,status,host,request,http_user_agent
+    types time:time,hostname:string,bytes_sent:float,request_time:float,upstream_response_length:string,upstream_response_time:string,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+    time_key time
+    time_format %FT%T%z
+</source>
+
+<match nginx.accesslog>
+    @type azure-loganalytics
+    customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
+    shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
+    log_type EVENT_TYPE_NAME # The name of the event type. ex) NginxAcessLog
+    time_generated_field time
+    time_format %FT%T%z
+    add_tag_field true
+    tag_field_name mytag
+</match>
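For context on what the `customer_id`/`shared_key` pair in this config is for: the plugin posts records to the Azure Log Analytics HTTP Data Collector API, which authenticates each request with an HMAC-SHA256 signature over the request metadata. Below is a hedged sketch of that signing step; the `build_signature` helper name and the dummy key are ours, so consult the Azure documentation for the authoritative scheme:

```python
import base64
import hashlib
import hmac

# Sketch of how shared_key/customer_id are used to sign a request, per the
# Azure Log Analytics HTTP Data Collector API (illustrative helper; all
# values below are dummies, not real credentials).
def build_signature(customer_id, shared_key, body, date_rfc1123):
    string_to_sign = (f"POST\n{len(body)}\napplication/json\n"
                      f"x-ms-date:{date_rfc1123}\n/api/logs")
    digest = hmac.new(base64.b64decode(shared_key),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {customer_id}:{base64.b64encode(digest).decode()}"

auth = build_signature(
    "818f7bbc-8034-4cc3-b97d-f068dd4cd658",       # dummy workspace ID
    base64.b64encode(b"0" * 32).decode(),          # dummy base64 key
    b'[{"status": 200}]',                          # JSON request body
    "Wed, 13 Dec 2017 11:31:59 GMT",               # x-ms-date header value
)
# auth goes into the Authorization header of the POST to
# https://<customer_id>.ods.opinsights.azure.com/api/logs
```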