Google Safe Browsing API, Google Places, and more.

A short post this week. I have been messing with the Google APIs and PowerShell. The Google APIs tend to follow a pattern, so if nothing else I hope these functions serve as a solid example. That said, I like to make sure the code is at least a little useful outside of examples.

My favorite is the Safe Browsing API. Google constantly monitors sites for malware, social engineering, and other attacks; this is the same system that warns Google users before they click a flagged link in a search. One use for the function would be to watch domains you own for being flagged, either correctly or incorrectly. I was also kicking around the idea of an Active Directory DNS log parser that caches the queries and compares them against the Safe Browsing database. This might catch users engaging in unsafe behavior in a corporate environment.

You can find the function on Pastebin, or all of the scripts on my GitHub.

Safe browsing function:
#This URL is always flagged by Safe Browsing, for testing: malware.testing.google.test/testing/malware/
function get-BrowseSafe() {
    Param(
        [Parameter(Mandatory = $true)][string]$search,
        [AllowEmptyString()]$api_key = ""
    )
    #build the json object to send to google:
    $json = '{
    "client": {
        "clientId": "BrowseSafe_monitor",
        "clientVersion": "1"
    },
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": ['

    #loop through multiple semicolon-delimited urls
    $search_arr = $search -split ";"
    $count_max = $search_arr.count
    $count = 1
    foreach ($item in $search_arr) {
        if ($count -eq $count_max) {
            $json = $json + "{""url"": ""$item""}"
        } else {
            $json = $json + "{""url"": ""$item""},"
        }
        $count++
    }
    #close the json
    $json = $json + ']}}'

    #build the url with the correct api_key
    $url = "https://safebrowsing.googleapis.com/v4/threatMatches:find?key=" + $api_key
    try {
        $result = Invoke-RestMethod "$url" -Method POST -Body $json -ContentType 'application/json'
        $result = $result.matches
    } catch {
        echo $_
    }

    if ($result.count -eq 0) {
        return $false
    } else {
        return $result
    }
}
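As an aside, the request body above can also be assembled with ConvertTo-Json instead of hand-built strings. This is an alternative sketch, not the post's code; the URLs are just the test values:

```powershell
# Build the same request body from a hashtable; ConvertTo-Json handles
# quoting, so the semicolon-split URLs drop straight in.
$urls = "malware.testing.google.test/testing/malware/;example.com" -split ";"
$body = @{
    client = @{ clientId = "BrowseSafe_monitor"; clientVersion = "1" }
    threatInfo = @{
        threatTypes      = @("MALWARE", "SOCIAL_ENGINEERING")
        platformTypes    = @("ANY_PLATFORM")
        threatEntryTypes = @("URL")
        threatEntries    = @($urls | ForEach-Object { @{ url = $_ } })
    }
} | ConvertTo-Json -Depth 4
$body
```

The -Depth parameter matters here: the default of 2 would flatten the nested threatEntries array.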

Search-custom uses the Google Custom Search API, which allows you to set up your own set of sites and query them from Google's search database. You have to build a list of sites and base the searches on those. Each search list has its own ID that can be updated, so the function provided is more of a template for search lists you come up with. Be sure to grab the customsearch_id Google provides after you configure your project's search list.

You can find the function on Pastebin, or all of the scripts on my GitHub.

function search-custom() {
    Param(
        [Parameter(Mandatory = $true)][string]$search,
        [AllowEmptyString()]$customsearch_id = "<provide your own search id with google>",
        [AllowEmptyString()]$api_key = "<your key>"
    )

    try {
        $Search_results = invoke-restmethod "https://www.googleapis.com/customsearch/v1?q=$search&cr=us&cx=$customsearch_id&key=$api_key"
        $search_results = ($search_results.items) | select title, snippet, link
        return $search_results
    } catch {
        return $false
    }
}

Search-nearby utilizes the Google Places API. This is what recommends restaurants and the like near a location. I wanted this in my PowerShell profile, so I ended up writing a quick wrapper.

You can find the function on Pastebin, or all of the scripts on my GitHub.

function search-nearby() {
    Param(
        [Parameter(Mandatory = $true)][string]$search,
        [AllowEmptyString()]$api_key = "<your key>"
    )

    try {
        $Search_results = invoke-restmethod "https://maps.googleapis.com/maps/api/place/textsearch/json?query=$search&key=$api_key"
        $search_results = ($search_results.results) | select Name, types, Formatted_address, price_level, Rating
        return $search_results
    } catch {
        return $false
    }
}

The other two functions in this post are YouTube-based. It is worth checking out the YouTube API documentation. I found the YouTube API the most confusing, and I still haven't spent the time to figure out how to post videos. At least these will get you started exploring YouTube.

You can find the function on Pastebin, or all of the scripts on my GitHub.


function Get-youtubesearch() {
    Param(
        [Parameter(Mandatory = $true)][string]$search,
        [AllowEmptyString()]$max_page = 5,
        [AllowEmptyString()]$copyright = "any",
        [AllowEmptyString()]$youtube_key = ""
    )

    $Search_results = invoke-restmethod "https://www.googleapis.com/youtube/v3/search?part=snippet&q=$search&type=video&videoLicense=$copyright&key=$youtube_key"
    $page_count = 1
    $video_list = @()

    while (($Search_results.nextPageToken) -and ($page_count -le $max_page)) {
        $next_page = $Search_results.nextPageToken

        foreach ($video_info in $search_results.items) {
            $video_id = $video_info.id.videoid
            $video_stats = invoke-restmethod "https://www.googleapis.com/youtube/v3/videos?part=statistics&id=$video_id&key=$youtube_key"
            [int]$views = $video_stats.items.statistics.viewcount
            [int]$likes = $video_stats.items.statistics.likecount
            [int]$dislikes = $video_stats.items.statistics.dislikeCount
            $title = $video_info.snippet.title
            $link = "https://youtube.com/watch?v=$video_id"

            $video_list += new-object psobject -Property @{
                title = "$title";
                video_id = "$video_id";
                likes = $likes;
                dislikes = $dislikes;
                views = $views;
                link = "$link";
            }
        }

        $Search_results = invoke-restmethod "https://www.googleapis.com/youtube/v3/search?part=snippet&pageToken=$next_page&type=video&q=$search&videoLicense=$copyright&key=$youtube_key"
        $page_count++
    }

    return $video_list
}
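Since the function returns plain PSObjects, standard pipeline cmdlets work on the results. A quick post-processing sketch; the sample objects below are made up, standing in for real API results:

```powershell
# Fake results in the same shape the function returns
$video_list = @(
    (new-object psobject -Property @{ title = "Clip A"; views = 120; likes = 10; dislikes = 1 }),
    (new-object psobject -Property @{ title = "Clip B"; views = 980; likes = 55; dislikes = 3 })
)

# Most-viewed first
$video_list | Sort-Object views -Descending | Select-Object title, views
```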

function get-youtubepopular() {
    Param(
        [AllowEmptyString()]$max_page = 5,
        [AllowEmptyString()]$copyright = "any",
        [AllowEmptyString()]$youtube_key = ""
    )

    $Search_results = invoke-restmethod "https://www.googleapis.com/youtube/v3/videos?chart=mostPopular&key=$youtube_key&part=snippet"
    $page_count = 1
    $video_list = @()

    while (($Search_results.nextPageToken) -and ($page_count -le $max_page)) {
        $next_page = $Search_results.nextPageToken

        foreach ($video_info in $search_results.items) {
            $video_id = $video_info.id
            $video_stats = invoke-restmethod "https://www.googleapis.com/youtube/v3/videos?part=statistics&id=$video_id&key=$youtube_key"
            [int]$views = $video_stats.items.statistics.viewcount
            [int]$likes = $video_stats.items.statistics.likecount
            [int]$dislikes = $video_stats.items.statistics.dislikeCount
            $title = $video_info.snippet.title
            $link = "https://youtube.com/watch?v=$video_id"

            $video_list += new-object psobject -Property @{
                title = "$title";
                video_id = "$video_id";
                likes = $likes;
                dislikes = $dislikes;
                views = $views;
                link = "$link";
            }
        }

        #part=snippet is required on the paging call as well
        $Search_results = invoke-restmethod "https://www.googleapis.com/youtube/v3/videos?chart=mostPopular&part=snippet&pageToken=$next_page&key=$youtube_key"
        $page_count++
    }

    return $video_list
}

Moving AD Objects using Powershell, Subnets, or Active Directory Sites and Services

This week I was working on a script that pulls computer objects from the Computers container and places each one into an OU by subnet. The goal was simple configuration, and the method I came up with for configuration was hashtables.


$Org_List = @{"192.168.0.0/28" = "ou=site1,dc=test,dc=local";
              "192.168.3.0/24" = "ou=site2,dc=test,dc=local";
              "172.16.0.0/24" = "ou=site2,dc=test,dc=local"}

#Configure
$Computer_List = get-adcomputer -filter { Enabled -eq $true } -searchbase "CN=computers,DC=test,dc=local" | select DnsHostName, DistinguishedName

The complete script for this can be found on Pastebin or on my GitHub.

Adding new subnets and OUs is just a matter of modifying the existing hash table; the script parses that data and moves the computer objects.
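A minimal sketch of the matching step. The helper below is hypothetical; the actual script uses the find-ipcidr and ip_to_int32 functions shown later in this post, then calls Move-ADObject on the matched computer. The ranges here are computed inline so the example stands alone:

```powershell
$Org_List = @{ "192.168.0.0/28" = "ou=site1,dc=test,dc=local";
               "192.168.3.0/24" = "ou=site2,dc=test,dc=local" }

function Get-TargetOu([string]$ip, [hashtable]$map) {
    # Pack the dotted-quad into a 64-bit integer (avoids Int32 overflow)
    $ipInt = 0L
    $ip.Split(".") | ForEach-Object { $ipInt = $ipInt * 256 + [int64]$_ }
    foreach ($cidr in $map.Keys) {
        $net, $prefix = $cidr.Split("/")
        $startInt = 0L
        $net.Split(".") | ForEach-Object { $startInt = $startInt * 256 + [int64]$_ }
        # Last address of the block: start + 2^(host bits) - 1
        $endInt = $startInt + [int64][math]::Pow(2, 32 - [int]$prefix) - 1
        if (($ipInt -ge $startInt) -and ($ipInt -le $endInt)) { return $map[$cidr] }
    }
}

Get-TargetOu "192.168.3.42" $Org_List   # ou=site2,dc=test,dc=local
```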

This works well, but then a friend pointed out that he needed to re-IP a whole office floor, and this particular design would require him to go back and update the hash table. I thought about it and realized the correct way to handle this was to piggyback on another maintenance task that would have to happen anyway, so I tied the script to Sites and Services:


$Org_List = @{"Office1" = "ou=site1,dc=test,dc=local";
              "Office2" = "ou=site2,dc=test,dc=local"}

#Configure Computer search and limitations
$Computer_List = get-adcomputer -filter { Enabled -eq $true } -searchbase "CN=computers,DC=test,dc=local" | select DnsHostName, DistinguishedName

 

I still used the hash table, but now it matches the site names in my test environment.

Now when an office or floor is redone, the normal maintenance done in Sites and Services will make sure the script continues to drop the computer objects into the correct OU.

The full script can be found on Pastebin or on GitHub.

A quick breakdown of key functions in both scripts:

Find-ipcidr

function find-ipcidr() {
    Param( [Parameter(Mandatory = $true)]$IP_Cidr )
    $Ip_Cidr = $IP_Cidr.Split("/")
    #convert the network address to a 32-character binary string
    $Ip_Bin = ($IP_Cidr[0] -split '\.' | ForEach-Object { [System.Convert]::ToString($_, 2).PadLeft(8, '0') }).ToCharArray()

    #set every host bit (past the prefix length) to 1
    for ($i = 0; $i -lt $Ip_Bin.length; $i++) {
        if ($i -ge $Ip_Cidr[1]) {
            $Ip_Bin[$i] = "1"
        }
    }

    #convert each group of 8 bits back to a decimal octet
    [string[]]$IP_Int = @()
    for ($i = 0; $i -lt $Ip_Bin.length; $i++) {
        $PartIpBin += $Ip_Bin[$i]
        if (($i + 1) % 8 -eq 0) {
            $IP_Int += [Convert]::ToInt32(($PartIpBin -join ""), 2)
            $PartIpBin = ""
        }
    }

    $IP_Int = $IP_Int -join "."
    return $IP_Int
}

 

find-ipcidr gets the ending IP of a CIDR range. For example, 192.168.0.0/24 becomes 192.168.0.255. More information on CIDR can be found here: https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing
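A standalone cross-check of the same idea, using integer bit math instead of binary strings. This is my own variant, assuming the network-address side of the CIDR is well-formed:

```powershell
function Get-CidrLastIp {
    param([Parameter(Mandatory = $true)][string]$Cidr)
    $net, $prefix = $Cidr.Split('/')
    $octets = $net.Split('.')
    # Pack the four octets into a 64-bit integer (avoids Int32 overflow)
    [int64]$ip = ([int64]$octets[0] * 16777216) + ([int64]$octets[1] * 65536) + ([int64]$octets[2] * 256) + [int64]$octets[3]
    # OR in the host bits: all ones past the prefix length
    [int64]$hostMask = [int64][math]::Pow(2, 32 - [int]$prefix) - 1
    $ip = $ip -bor $hostMask
    # Unpack back to dotted-quad notation
    $a = [math]::Floor($ip / 16777216) % 256
    $b = [math]::Floor($ip / 65536) % 256
    $c = [math]::Floor($ip / 256) % 256
    $d = $ip % 256
    return "$a.$b.$c.$d"
}

Get-CidrLastIp "192.168.0.0/24"   # 192.168.0.255
Get-CidrLastIp "192.168.0.0/28"   # 192.168.0.15
```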

ip_to_int32

function ip_to_int32() {
    Param( [Parameter(Mandatory = $true)]$IP_int32 )
    $IP_int32_arr = $IP_int32.split(".")
    #octet1*256^3 + octet2*256^2 + octet3*256 + octet4
    $return_int32 = ([Convert]::ToInt32($IP_int32_arr[0]) * 16777216 + [Convert]::ToInt32($IP_int32_arr[1]) * 65536 + [Convert]::ToInt32($IP_int32_arr[2]) * 256 + [Convert]::ToInt32($IP_int32_arr[3]))
    return $return_int32
}

 

This converts the IP into a 32-bit integer (in practice, PowerShell promotes values above the Int32 maximum, but the comparisons still work). I use this in the scripts to tell whether an IP is within a range. The code:


if(($search_ip_int32 -ge $start_of_range_int32) -and ($search_ip_int32 -le $end_of_range_int32))

simply checks whether the IP I am looking at is greater than or equal to the start of the CIDR range and less than or equal to the end of it. This is far simpler to work with than alternatives such as looping over every address in the range.
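A worked example with real numbers, computed with the same octet math; 64-bit integers are used here because the values exceed what a true Int32 can hold:

```powershell
$start  = [int64]192*16777216 + [int64]168*65536 + 0*256 + 0     # 192.168.0.0   -> 3232235520
$end    = [int64]192*16777216 + [int64]168*65536 + 0*256 + 255   # 192.168.0.255 -> 3232235775
$target = [int64]192*16777216 + [int64]168*65536 + 0*256 + 42    # 192.168.0.42  -> 3232235562

($target -ge $start) -and ($target -le $end)   # True: 192.168.0.42 is in range
```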

The rest of the script is simply breaking up the hashtable, running the above functions, checking the range, and performing the move.

Though my two scripts may not work perfectly in your environment, feel free to use any part of them, and let me know in the comments how you have kept OUs organized.

Thanks for reading.

Using Powershell with the RingCentral API

I had the opportunity to work with the RingCentral API and ended up with a series of functions that handle the authorization/token information for you, as long as you don't mind using the password authorization workflow. When getting the API key (https://developer.ringcentral.com/library/getting-started.html), make sure the "Platform Type" is set to Desktop (Mac/Windows/Other).

Code covered in this post can be found on pastebin or on my github
Quick Review of the Process and API:

RingCentral's API is still pretty new; it looks like it has been around for about a year and a half. There are some strict requirements to meet before going from the sandbox environment to their production environment:
1) All permissions requested must be used.

2) A total of 20 calls must be made, and each permission must be used at least once.

3) Of all calls made over 48 hours, no more than 5% can generate errors (getting throttled, bad calls, etc.).

4) It can take up to 7 days for your app to be approved.

Much of the detection is automatic, but it can take several hours for it to pick up your usage and errors. None of the requirements for moving from the sandbox app to the production app are unreasonable, but I found the process a bit clunky. Another interesting quirk is that not all permissions for your app can be requested from the portal; you'll need to contact email support for something like downloading call recordings. The email support is pretty responsive, and I was pleased with that. I do wish the sandbox environment came pre-populated with bogus data. Rather than exploring and getting ideas for what I could do with the API, I kept spending time building fake data to resemble the production environment. Not horrible, but something that could be a bit friendlier.

The API documentation lacked any examples for PowerShell. Even so, the documentation is generalized and written well enough that figuring out a few examples wasn't a showstopper.

Before I go into the authorization scripts I want to focus on the global variables and configuration.

To get started:

$api_server_url = "https://platform.devtest.ringcentral.com"
$media_server_url = "https://media.devtest.ringcentral.com:443"
$username = ''
$password = ''
$extension = ''

#Base64-encoded string in the form AppKey:AppSecret. Use a site like https://www.base64encode.org/ or build your own; it never changes.
$app_key = ""

$log_path = "C:\scripts\log\ringcentral.log"

All of the information can be found at https://service.devtest.ringcentral.com and https://developer.ringcentral.com/library/tutorials/get-started.html. I didn't bother to generate the $app_key dynamically since it is set-and-forget; most encoder sites will produce it for you.
Make sure your $log_path points at the log file.

You can test that everything is working by running get-authstatus.
If it returns $true, you're ready to start running API calls. If you get $false, check the file at $log_path and look for typos in your authorization information.

If you want to use the auth functions in your own scripts you only really need a small chunk of code:

if(get-authstatus) {
$auth_token = $auth_obj.authorization
} else {
return $false
}

An example of a function making a call to the RingCentral API:

Function get-calllog() {
    if (get-authstatus) {
        $auth_token = $auth_obj.authorization
    } else {
        return $false
    }
    try {
        $call = invoke-restmethod -uri "$api_server_url/restapi/v1.0/account/~/call-log" -Headers @{"Authorization" = "$auth_token"}
    } catch {
        return $false
    }
    return $call
}

The code can be found on pastebin or on my github

Besides the global variables, the whole thing is three functions:

get-authstatus :

function get-authstatus() {
    if (($auth_obj.expires_in -gt (get-date)) -and ($auth_obj.isset -eq $true)) {
        return $true
    } elseif (($auth_obj.expires_in -lt (get-date)) -and ($auth_obj.isset -eq $true) -and ($auth_obj.refresh_token_expires_in -gt (get-date))) {
        if (auth_refresh) {
            echo "Token expired and refreshed successfully" >> $log_path
            return $true
        } else {
            echo "Failed Token refresh" >> $log_path
            return $false
        }
    } else {
        if (auth_initiate) {
            echo "Initializing Auth token" >> $log_path
            return $true
        }
        return $false
    }
}

This function checks that the initial authorization has been done and that none of the tokens have expired. If they have expired, it calls the correct function to initialize the authorization or to renew it.
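The expiry logic in isolation looks like this (the times are made up). The key design choice is storing absolute local expiry times when the token is created, so later checks are a plain date comparison:

```powershell
# A stand-in auth object, shaped like the one the auth functions build
$auth_obj = [PSCustomObject]@{
    isset                    = $true
    expires_in               = (Get-Date).AddSeconds(3600)    # access token: 1 hour out
    refresh_token_expires_in = (Get-Date).AddSeconds(604800)  # refresh token: 7 days out
}

# Same shape as the get-authstatus checks:
$token_valid = ($auth_obj.expires_in -gt (Get-Date)) -and ($auth_obj.isset -eq $true)
$can_refresh = ($auth_obj.expires_in -lt (Get-Date)) -and ($auth_obj.refresh_token_expires_in -gt (Get-Date))
$token_valid   # True: no API call needed yet
```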

auth_initiate:

function auth_initiate() {

    #Authentication post data
    $auth_post = @{
        grant_type = 'password'
        username = $username
        password = $password
        extension = $extension
    }

    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Basic $app_key")
    $headers.Add("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8")

    try {
        $url = $api_server_url + "/restapi/oauth/token"
        $auth_token = invoke-restmethod -Method Post -Uri $url -Body $auth_post -headers $headers -ContentType "application/x-www-form-urlencoded"
        $authorization = $auth_token.token_type + " " + $auth_token.Access_token
    } catch {
        echo "Error initializing token: $_" >> $log_path
        return $False
    }

    $Global:auth_obj = [PSCustomObject] @{
        Isset = $true
        authorization = $authorization
        refresh_token = $auth_token.refresh_token
        expires_in = (Get-date).Addseconds($auth_token.expires_in)
        refresh_token_expires_in = (Get-date).Addseconds($auth_token.refresh_token_expires_in)
        scope = $auth_token.scope
        owner_id = $auth_token.owner_id
        endpoint_id = $auth_token.endpoint_id
    }

    return $auth_obj
}

The above function creates the initial authorization token and converts the token's expiration into local system time, doing the same for the refresh token's expiration. All of that information is stored in the global authorization object.

auth_refresh:

function auth_refresh() {
    $refresh_post = @{
        grant_type = 'refresh_token'
        refresh_token = $auth_obj.refresh_token    #pull the stored token from the global object
    }

    $url = $api_server_url + "/restapi/oauth/token"

    $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
    $headers.Add("Authorization", "Basic $app_key")
    $headers.Add("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8")

    try {
        $auth_token = invoke-restmethod -Method Post -Uri $url -Body $refresh_post -headers $headers -ContentType "application/x-www-form-urlencoded"
        $authorization = $auth_token.token_type + " " + $auth_token.Access_token
    } catch {
        echo "Error refreshing token: $_" >> $log_path
        return $false
    }

    $Global:auth_obj = [PSCustomObject] @{
        Isset = $true
        authorization = $authorization
        refresh_token = $auth_token.refresh_token
        expires_in = (Get-date).Addseconds($auth_token.expires_in)
        refresh_token_expires_in = (Get-date).Addseconds($auth_token.refresh_token_expires_in)
        scope = $auth_token.scope
        owner_id = $auth_token.owner_id
        endpoint_id = $auth_token.endpoint_id
    }

    return $auth_obj
}

If the access token has expired, this uses the refresh token to re-authorize without prompting for credentials again.

That is it for today. Thanks for reading.

Scheduled Power-Down of EC2 Instances

After working on the AWS report, I decided to start using AWS tags to control a script.

There are plenty of ways to stop and start EC2 instances that don't need tags. You could use Data Pipeline, or do the same thing with an array and a scheduled task, but those approaches wouldn't auto-scale.

This ended up being a fun way to mess around with tags and reading tags. I would in no way recommend anyone use tags to the extent I did to manage their EC2 instances. With great tools like Jenkins and Lambda functions, there are far saner ways to go about this.

A simple tag detect and power off script looks a lot like my previous report script:

#
# AWS Power On/Off Scheduler
# Tag Name: Power_Options
# Toggles power on and off.
#
# Requires AWS plugin: https://aws.amazon.com/powershell/
# Set-AWSCredentials -AccessKey -SecretKey -Storeas # Full instructions found http://i-script-stuff.electric-horizons.com/
#

#profiles to check
$profile_list = ("Example_1")

#Quick check: PowerShell 3.0 and older change the method of plugin loading.
if ($PSVersionTable.PSVersion.Major -ge 4) {
    Import-Module AWSPowerShell
} else {
    Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
}

#Actual engine: parses through profiles and regions.
foreach ($profile in $profile_list) {
    Set-AWSCredentials -ProfileName $profile
    $region_list = Get-AWSRegion | select -expandproperty Region

    foreach ($region in $region_list) {
        $Instance_list = Get-EC2Instance -region $region | select -expandproperty instances

        $VPC_list = Get-EC2Vpc -Region $region
        foreach ($VPC in $VPC_list) {
            $Instance_list | Where-Object { $_.VpcId -eq $VPC.VpcId } | foreach-object {
                $Instance_name = ($_.Tags | Where-Object { $_.Key -eq 'Name' }).Value
                if ($Power_Options = ($_.Tags | Where-Object { $_.Key -eq 'Power_Options' }).Value) {
                    if ($_.State.Name -like "running") {
                        stop-EC2Instance -InstanceId $_.InstanceId -Region $region
                    } else {
                        Start-EC2Instance -InstanceId $_.InstanceId -Region $region
                    }
                }
            }
        }
    }
}

The above script can be found on Paste Bin or on my github

But with 256 characters available in a tag, you can do much better than turning an already-on instance off or an off instance on. I got a bit carried away and built a multi-delimiter tag system for powering systems on and off and notifying users. The tag system accepts day abbreviations ("Mon, Tue, Wed, Thu, Fri, Sat, Sun") and hours in 24-hour format (6, 7, 8, etc. for AM and 18, 19, 20, etc. for PM). This creates a pretty horrific tag for a human to read. A Monday-through-Friday power on/off schedule with emails sent for each action looks like this:
PowerOn,Mon:5,Tue:5,Wed:6,Thu:6,Fri:6;PowerOff,Mon:18,Tue:18,Wed:22,Thu:18,Fri:18;Notify,user_example@example.com,user_example2@example.com

The format is {command}{comma}{scheduled day}{colon}{scheduled hour}{comma}...{semicolon to end the command}. Commands can repeat for multiple shutdowns and the like. I also added a few shortcut keywords to keep a schedule readable by humans: "Weekend" checks Saturday and Sunday only, "Weekday" checks Monday through Friday, and "Allweek" is every day.
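Parsing that format is two nested splits. A sketch using a trimmed version of the example tag (the object shape here is my own, not the script's):

```powershell
$tag = "PowerOn,Mon:5,Tue:5,Wed:6;PowerOff,Mon:18,Tue:18,Wed:22;Notify,user_example@example.com"

$schedule = foreach ($command in ($tag -split ";")) {
    $parts  = $command -split ","
    $action = $parts[0]    # PowerOn / PowerOff / Notify
    foreach ($arg in $parts[1..($parts.Count - 1)]) {
        if ($arg -match "^(?<day>\w{3}):(?<hour>\d{1,2})$") {
            # A day:hour pair
            [pscustomobject]@{ Action = $action; Day = $Matches.day; Hour = [int]$Matches.hour; Target = $null }
        } else {
            # Anything else, e.g. a notify email address
            [pscustomobject]@{ Action = $action; Day = $null; Hour = $null; Target = $arg }
        }
    }
}
$schedule
```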

The difficulty in reading isn't really an issue, as I plan on this format being generated and read by other scripts as part of a larger management system. That system will notify a team or person that it is going to shut down a VM, or a group of VMs, that does not need to be used after work hours. The notification will also provide a link that allows the shutdown to be canceled for the day, or the VMs to be powered back on. I'll do that section in PHP; it will largely be a front-end/user-input setup.

The main driver for this is that EC2 instances cost money while they are running, so shutting down unneeded instances for a few hours a day can add up. Some examples that came to mind: a call center open from 6 am to 6 pm may not need a reporting server online from 7 pm until 5 am, and a company may not need a staging or POC environment running over the weekend.

The larger segment of script using the tags can be found on Pastebin or on my GitHub.
I don't like how out of control the scheduling got with the tag delimiter format I wrote, though I would still use tags in this manner for things like adding monitoring. An example that comes to mind: tag instances with Icinga2 groups such as Icinga2:LAMP_standard,Autobouncer, add those tags to an autoscaling group, and Icinga2 can grow and shrink with the environment without human intervention.

Well, that is it for today. I'll post again soon with the completion of my notify/cancel shutdown system.

AWS Report using Powershell

I have been using AWS off and on for almost a year, both for professional and personal projects. It is a great service, and if your project is small enough, a very cost-effective hosting solution. I decided to start automating a few tasks using PowerShell and the AWS PowerShell module.

I wrote a quick reporting script to run as a scheduled task to keep track of my personal projects. The report can check multiple AWS accounts, which I refer to as environments in the report. The script generates an HTML file and a CSV file. The code can be found on my GitHub or on Pastebin.

To configure the script for your environment:

Download the plugin here and install it.

I'd suggest creating a read-only user for each environment you need to check. Creating a user is pretty well documented.

When prompted, download the access keys, or copy and paste them into Notepad; we'll need those to authenticate against AWS.

After the user is created, make a group called "Read only" and, from the permissions list, provide it with the "ReadOnlyAccess" policy.

Now that the AWS account is created and you have noted or recreated your access keys, log into the computer/server where you installed the plugin as the service account user that will run the script, and import the AWS module:

if ($PSVersionTable.PSVersion.Major -ge 4) {
    Import-Module AWSPowerShell
} else {
    Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
}

Take the access keys for each environment and enter them into PowerShell like so:

Set-AWSCredentials -AccessKey <access key> -SecretKey <secret key> -Storeas <profile name>

For this script, the <profile name> will be reported as the environment, so use names like Prod, Staging, Dev, or Corp; whatever works for your environment.

At the top of the script, enter each of the profile names into the array:

#profiles to check
$profile_list = ("Example_1")

Check the path configs and make sure everything is correct for your environment:

#Path configs
$website_path = "C:\inetpub\awsreport.html"
$website_tmp_path = ".\awsreport.tmp"
$csv_tmp_path = ".\awscsvreport.tmp"
$csv_path = "C:\inetpub\awsreport.csv"

Create a scheduled task to run as often as you would like updates.

Thanks for reading. If you have questions feel free to leave a comment.

PHP, Powershell and Shell_Exec()

Over in /r/PowerShell I have been seeing an increase in posts about using PHP as a front end. The general consensus has been to use shell_exec to launch the script and pass the variables to PowerShell. Looking around online, I haven't seen any of the tutorials address the security concerns that come with shell_exec.

In this post I'll show an example of an attack on shell_exec as it relates to launching PowerShell. Before we start, a quick note: all of the mitigation techniques I am going to show should be part of a larger security profile. Some examples of things to consider when deciding on your security profile:
1) Proper delegation of Service account permissions in active directory and on the local system.
2) Installation of tools such as Mod_security
3) Installation of Anti-virus
4) Limiting access of the PHP interface to only authorized users.

The Attack:

First we'll need a PHP page that launches a PowerShell script. A text copy of the code for the examples can be found on Pastebin or on my GitHub. Pictures were used because WordPress formatting was driving me nuts.
PHP
index_php_basic

Powershell
powershell_basic

Ideally, a user enters data into the PHP form; the data is logged and returned to the user.

For this example I submitted “test123”. The text was logged in log.txt and the expected response was sent back:

output_expected_basic

Now let's send the attack string to the form:

test11;” dir c:\ >> C:\inetpub\wwwroot\dir_list.txt

This time the response isn't exactly what we expected. The webpage output the same response, but half of the attack string is missing:

output_unexpected_basic

The log file also only shows test11. If we go to C:\inetpub\wwwroot\dir_list.txt, or simply download it from the root of our web server, we get a directory listing like so:

    Directory: C:\

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
d-----         6/7/2016   9:54 PM            inetpub
d-----        8/22/2013   8:52 AM            PerfLogs
d-r---        5/27/2016   7:31 AM            Program Files
d-----        5/27/2016   7:31 AM            Program Files (x86)
d-----         6/4/2016   8:44 PM            Scripts
d-----        5/21/2016   6:51 PM            tmp
d-r---        5/24/2016  11:55 PM            Users
d-----        5/21/2016   6:34 PM            Windows

This is a very basic proof of concept. The attacks can become much more complex, leading to a fully compromised web server, or a compromised Active Directory instance if the service account is a domain admin, etc.

Mitigating the attacks:
One of the most common defenses, but the hardest to do correctly, is a combination of regular expressions, escapeshellcmd, and escapeshellarg.
An example like this might work for updating a title, first name, or last name:

preg_match_index_php

Code can be found here on Pastebin, or the GitHub repo still contains everything.

A better solution would be to submit your information to MySQL, or to a file format of your choosing (XML, JSON, CSV) on the local drive. Then use shell_exec to launch the PowerShell script without any direct user input, and let the PowerShell script parse the data and submit it. You can also use a scheduled task to launch this method, which has the added advantage of keeping the service account user away from IIS entirely.
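A sketch of the PowerShell side of that approach; the queue path and field names below are invented for the example:

```powershell
function Invoke-QueuedInput {
    param([Parameter(Mandatory = $true)][string]$QueuePath)
    if (-not (Test-Path $QueuePath)) { return @() }
    # The data arrives via a file, never the command line, so shell
    # metacharacters like ; or >> in a field are just text.
    $entries = Get-Content $QueuePath -Raw | ConvertFrom-Json
    $processed = foreach ($entry in $entries) {
        "Processing: $($entry.name)"
    }
    Remove-Item $QueuePath    # clear the queue once handled
    return $processed
}
```

The PHP side only ever writes the JSON file; shell_exec then launches the script with a fixed, argument-free command line.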

In some cases you need to allow the full range of characters, absolutely must take user input, and the data cannot be allowed to rest in a file or MySQL table. For these rare cases I would suggest encoding the data. In this example Base64 is used:
Php:
base_64_php

Powershell:
base_64_powershell

A text version can be found on Pastebin, or the GitHub repo still contains everything.
When we submit the attack code again (test11;" dir c:\ >> C:\inetpub\wwwroot\dir_list.txt),
we see the full attack returned as well as logged:
attack_Return_base64

Base64 encoding can be a bit heavy on the system and probably wouldn't work for a bigger site.
I hope some part of this has proven to be informative. All code examples can be found here: https://github.com/ryanlangley4/ps_and_php_sec

*None of the examples done here limit the length of user input, or handle any number of other things, like accepting spaces when sending to PowerShell. The goal is to focus on shell_exec and not get too bogged down in details.

Generate up to 5,000 (real looking) Active directory users for test labs

I have been spending time lately simulating AD migrations and investigating user experience. The general goal is to find, and purposely cause, issues during the migration, and then create audits to find and mitigate the pain points.

In order to accomplish that, I needed some real-looking users in my environment, so I went ahead and created a script that utilizes the API found over at https://www.randomuser.me

A neat little API: you can generate fake user information in blocks of up to 5,000. The full documentation can be found here.

The script is broken up into 2 functions.

The first function, "find_ad_id", is useful on its own:

function find_ad_id($first, $last) {
    $first = $first -Replace "\s*"
    $last = $last -Replace "\s*"
    $not_found = $true
    for ($i = 1; $i -le $first.length; $i++) {
        $Sam_account = ""
        $letters_first = ""

        for ($l = 0; $l -ne $i; $l++) {
            $letters_first += $first[$l]
        }

        $sam_account = $letters_first + $last
        if (-not (Get-aduser -Filter { SamaccountName -eq $sam_account })) {
            $not_found = $false
            return $sam_account
        }
    }

    if ($not_found -eq $true) {
        return "ERROR:FAIL"
    }
}

find_ad_id is a function I wrote for my personal profile a while ago. It's pretty straightforward, but I find it very useful. It takes a first and last name as input and strips all spaces out of the names. It then breaks the first name into progressively longer prefixes, combining each with the last name until it finds an unused SamAccountName value in AD.

For example, for John Doe:
jdoe is attempted, then jodoe, and so on, until either a free SamAccountName is found or an error is returned.
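The candidate sequence can be sketched without touching AD. This is a hypothetical helper, lowercased to match the example above; the real function stops at the first name AD doesn't already have:

```powershell
function Get-SamCandidates {
    param([string]$First, [string]$Last)
    $First = $First -replace "\s"    # strip spaces, as find_ad_id does
    $Last  = $Last  -replace "\s"
    # Emit each prefix of the first name joined to the last name
    for ($i = 1; $i -le $First.Length; $i++) {
        ($First.Substring(0, $i) + $Last).ToLower()
    }
}

Get-SamCandidates "John" "Doe"   # jdoe, jodoe, johdoe, johndoe
```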

The second function is called “Generate”:

function generate() {
    $character = @("!","$","%","^","&","*","(",")","?")
    $letters_low = @("a","b","c","d","e","f","g","h","i","j","k","l","m","n","o","p","q","r","s","t","u","v","w","x","y","z")
    $letters_cap = @("A","B","C","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z")
    $numbers = @("1","2","3","4","5","6","7","8","9","0")
    $iterations = get-random -minimum 8 -maximum 17
    [string]$pass_value = ""

    #8-16 randomly chosen characters; Get-Random's -maximum is exclusive,
    #so each index stays inside its array's bounds.
    for ($i = 0; $i -ne $iterations; $i++) {
        switch (Get-random -minimum 1 -maximum 5) {
            1 { $pass_value += $character[(Get-random -minimum 0 -maximum $character.count)] }
            2 { $pass_value += $letters_low[(Get-random -minimum 0 -maximum $letters_low.count)] }
            3 { $pass_value += $letters_cap[(Get-random -minimum 0 -maximum $letters_cap.count)] }
            4 { $pass_value += $numbers[(Get-random -minimum 0 -maximum $numbers.count)] }
        }
    }

    #guarantee a symbol, a capital letter, and a number so the password
    #always meets AD complexity requirements
    $pass_value += $character[(Get-random -minimum 0 -maximum $character.count)]
    $pass_value += $letters_cap[(Get-random -minimum 0 -maximum $letters_cap.count)]
    $pass_value += $numbers[(Get-random -minimum 0 -maximum $numbers.count)]

    return $pass_value
}

Generate creates an 11- to 19-character password for each user. It first does 8-16 iterations of randomly chosen lowercase letters, uppercase letters, numbers, or symbols. Then it purposely appends a symbol, a capital letter, and a number. This of course limits what the final characters will be, but it makes sure the password always meets Active Directory's minimal password requirements.
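A quick sanity check for the output, matching the policy described above; this is my own helper, not part of the script:

```powershell
function Test-PasswordMeetsPolicy {
    param([Parameter(Mandatory = $true)][string]$Password)
    # Length plus the three guaranteed character classes
    ($Password.Length -ge 11) -and
        ($Password -cmatch '[A-Z]') -and
        ($Password -match '[0-9]') -and
        ($Password -match '[!$%^&*()?]')
}

Test-PasswordMeetsPolicy "abcdEFGH12!"   # True
Test-PasswordMeetsPolicy "tooshortA1"    # False: under 11 characters, no symbol
```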

The rest of the script simply handles the data from the API, creates the user, and logs the password.
If you have any questions, feel free to leave a comment.

The script can be found:
Here on Pastebin
OR
Here on github

This script is posted without warranty.