AWS S3Beat with Cloudflare
The steps below configure Logpush to push account-scoped data to S3.
1. On the Cloudflare dashboard homepage, click the “Logpush” option under “Analytics & Logs” to configure Logpush for account-activity logs.
2. In the Logpush window, click “Create a Logpush Job”.
3. Select “Amazon S3” so that Logpush pushes logs to S3.
4. Configure the destination location (bucket name and region).
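Behind the scenes, the destination resolves to an S3 URI with the region passed as a query parameter. A sketch of the convention, with a placeholder bucket name and path:

```
s3://my-logpush-bucket/cloudflare/audit?region=us-east-1
```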
5. Once the bucket name is entered in the destination details, a policy JSON is generated automatically. Copy it into the S3 bucket's “Bucket policy”.
6. Add the generated policy JSON to the bucket and click “Save Changes” (a representative sketch follows).
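The generated policy grants Cloudflare's Logpush user permission to write objects into the bucket. A minimal sketch is below; the bucket name is a placeholder, and the exact principal ARN should be taken from the policy the dashboard generates rather than from here:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::391854517948:user/cloudflare-logpush"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-logpush-bucket/*"
    }
  ]
}
```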
7. After saving the bucket policy, return to the Cloudflare page where the Logpush job was being configured and click the “Continue” button.
8. Cloudflare then sends a test file containing a token to the S3 bucket to confirm ownership.
9. Copy the contents of the “ownership-challenge” file and paste them into the Cloudflare configuration page.
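If opening the file in the S3 console is inconvenient, the token can also be read programmatically. A sketch using the AWS SDK for JavaScript v3; the bucket name and prefix are placeholders matching the earlier destination:

```ts
import { S3Client, ListObjectsV2Command, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// Find and read the ownership-challenge file Cloudflare wrote to the bucket.
async function readOwnershipChallenge(bucket: string, prefix: string): Promise<string> {
  const listing = await s3.send(new ListObjectsV2Command({ Bucket: bucket, Prefix: prefix }));
  const challenge = listing.Contents?.find((o) => o.Key?.includes("ownership-challenge"));
  if (!challenge?.Key) throw new Error("ownership-challenge file not found");

  const obj = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: challenge.Key }));
  return (await obj.Body!.transformToString()).trim(); // token to paste into the dashboard
}

readOwnershipChallenge("my-logpush-bucket", "cloudflare/audit/").then(console.log);
```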
10. Once ownership is confirmed, choose the required dataset, which for account-scoped logs is Audit Logs.
11. After selecting Audit Logs, configure the job: the job name, whether all events are pushed or a filter is applied, which data fields to include, the timestamp format, and so on.
12. Once the above configuration is saved, the Logpush job is ready.
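The same job can also be created through the Logpush API rather than the dashboard. A hedged sketch, assuming an API token with Logs Edit permission; the account ID, job name, bucket, and region are placeholders:

```ts
const ACCOUNT_ID = "your-account-id";
const API_TOKEN = process.env.CLOUDFLARE_API_TOKEN!;

// Create an account-scoped Logpush job for the audit_logs dataset.
async function createLogpushJob(ownershipChallenge: string): Promise<void> {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/logpush/jobs`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        name: "account-audit-logs-to-s3",
        dataset: "audit_logs",
        destination_conf: "s3://my-logpush-bucket/cloudflare/audit?region=us-east-1",
        ownership_challenge: ownershipChallenge,
        enabled: true,
      }),
    }
  );
  const body = await res.json();
  if (!res.ok) throw new Error(`Job creation failed: ${JSON.stringify(body.errors)}`);
  console.log("Created Logpush job", body.result?.id);
}
```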
13. To validate that logs are being sent to the S3 bucket, generate any account-scoped event. For example, create a Worker (a serverless function, in Cloudflare terminology): click the “Workers & Pages” option in the left-side menu.
14. Click the “Create” button on the next screen.
15. Click the “Create Worker” button.
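Creating a Worker drops you into the default “Hello World” template, which looks roughly like this:

```ts
// Default Worker template: answers every request with a fixed response.
export default {
  async fetch(request: Request): Promise<Response> {
    return new Response("Hello World!");
  },
};
```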
16. Click the “Deploy” button.
17. Once deployed, logs should start appearing in the S3 bucket.
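To confirm delivery, list the destination path and look for new objects; Logpush writes gzipped, newline-delimited JSON files. A sketch with the same placeholder bucket and prefix as above:

```ts
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// Print every object under the Logpush destination prefix.
async function listLogObjects(): Promise<void> {
  const res = await s3.send(
    new ListObjectsV2Command({ Bucket: "my-logpush-bucket", Prefix: "cloudflare/audit/" })
  );
  for (const obj of res.Contents ?? []) {
    console.log(obj.Key, obj.Size, obj.LastModified?.toISOString());
  }
}

listLogObjects();
```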