signinlca

signinlca

script to signin for volunteers at lca2015!

The script asks for input of firstname, lastname, tshirt size, amount of coffee volc and comments. It creates a python dict with this data along with the current date and hour. It gets the username of user and saves the data in the users home dir under the folder signinlca. It saves the data as a json object. Currently saves the file as firstname + lastname.

How could this be improved?

Signup/Signin System.

For signup - username, firstname, lastname, password(x2) is collected with input. Password is salted and hashed. Username, firstname, lastname, and password (salt/hash) is added to a dict. Dict is converted to a json object. json object is saved as a json file in the folder signinlca / USERNAME-FOLDER / .signup.json

For signin. Username is collected with input. Looks for folder of username. Opens .signup.json file - parsing data. Save the value of 'password' as a varible.

Asks for password (getpass.getpass('Password please: ') salt/hash this password. save password attempt if error, otherwise true complete signin.

TODO

add option to choose to login or logout. Y/N option for each one.

add logout script that appends to the login data. saves, time/date/comment. anything else?

Asign to jobs/room?

Graph up total hour worked in day/week

scp/rsync data to server/web page.

Make new account, use existing account.

Database of existing accounts... static page of files.

Add password to account

If you signin, doesn't ask if want to sign out.

If you signout, doesn't ask if you want to sign in.

Hash passwords

When creating account asked for username (which could be firstname + lastname), and password. Passwords are hashed and when user tries to login the password inputed is compared to the hashed password.

Save that hash as a varible that is then complared with the saved hash password.

I have their signin data. Now what to do with it? Save it as a json object to be then used when they signin later?

More security on it? Hash their usernames, firstnames, 2nd password?

In [52]:
import os
#import time
import json
import getpass
import arrow
import hashlib
from passlib.hash import pbkdf2_sha256
from walkdir import filtered_walk, dir_paths, all_paths, file_paths
In [53]:
gmtz = arrow.utcnow()
In [54]:
yrmt = gmtz.strftime("%Y")
mthza = gmtz.strftime("%m")
dthaq = gmtz.strftime("%d")
gmtz.strftime("%Y")
#yearz = strftime("%y", gmtime())
#monthz = strftime("%m", gmtime())
#dayz = strftime("%d", gmtime())
Out[54]:
'2015'
In [55]:
yrmt
Out[55]:
'2015'
In [56]:
mthza
Out[56]:
'02'
In [57]:
dthaq
Out[57]:
'16'
In [58]:
def returndate():
    return (dthaq + '-' + mthza + '-' + yrmt)

def returntime():
    return gmtz.strftime('%H:%M:%S')

puser = ('wcmckee')

yrnum = gmtz.strftime("%Y")
mnthnum = gmtz.strftime("%m")
dayzum = gmtz.strftime("%d")

signpath = ('/home/' + puser + '/signinlca')
yrpath = (signpath + '/' + yrnum)
mnthpath = (yrpath + '/' + mnthnum)
dayzpath = (mnthpath + '/' + dayzum)
In [59]:
if os.path.isdir(signpath) == True:
    print 'Path is there'
else:
    print 'Path not there'
    os.mkdir(signpath)
Path is there
In [60]:
if os.path.isdir(yrpath) == True:
    print 'Year Path is there'
else:
    print 'Year Path not there'
    os.mkdir(yrpath)
    
if os.path.isdir(mnthpath) == True:
    print 'Month Path is there'
else:
    print 'Month Path not there'
    os.mkdir(mnthpath)
    
if os.path.isdir(dayzpath) == True:
    print 'Day Path is there'
else:
    print 'Day Path not there'
    os.mkdir(dayzpath)
Year Path is there
Month Path is there
Day Path is there
In [61]:
dayzpath
Out[61]:
'/home/wcmckee/signinlca/2015/02/16'
In [62]:
os.chdir(dayzpath)
In [63]:
opsign = open('/home/wcmckee/signinlca/index.json', 'w')
In [77]:
signup = raw_input('signup y/n ')
signupd = dict()

numchez = 0

if 'y' in signup:
    print('Welcome to signup!')
    firnam = raw_input('firstname: ')
    signupd.update({"firstname":firnam, })
    lasnam = raw_input('last name: ')
    
    usenam = raw_input('username: ')
    emnam = raw_input('email: ')
    
    
    os.mkdir('/home/wcmckee/signinlca/usernames/' + usenam) 
    #passworz = passwd()
    
    pastest = getpass.getpass('password: ')

    pasnde = getpass.getpass('enter password again: ')
    
    signupd.update({"firstname":firnam, "lastname":lasnam,
                "username":usenam})
    
    hashez = pbkdf2_sha256.encrypt(pastest, rounds=200000, salt_size=16)
    emhash = pbkdf2_sha256.encrypt(emnam, rounds=200000, salt_size=16)
    
    signupd.update({"password":hashez, "email":emhash})
    
    savjsn = open('/home/wcmckee/signinlca/usernames/' + usenam + '/.signups.json', 'a')
    jsncov = json.dumps(signupd)
    savjsn.write(jsncov)
    savjsn.close()
    usdir = ('useradd -p ' + pastest + ' ' + usenam)
    os.system(usdir)
    print('Signup Complete. You can now signin with the username and password')

for logy in range(12):
    ferzr = (numchez)
    numchez = (numchez + 10)
    #usfaz = ('/home/wcmckee/signinlca/usernames/' + str(numchez) + usenam + '/index.json', 'w')
    os.mkdir('/home/wcmckee/signinlca/usernames/' + str(usenam) + '/' + str(logy))
signup y/n y
Welcome to signup!
firstname: will
last name: mckee
username: figlet
email: will@test.com
password: ········
enter password again: ········
Signup Complete. You can now signin with the username and password
In [65]:
#hashez = pbkdf2_sha256.encrypt(pastest, rounds=200000, salt_size=16)
#signupd.update({"password":hashez})
#signin. need to open 
print ('signin!')
loginam = raw_input('Username: ')
#Open logins.json, find the username json object
loginpas = getpass.getpass('Password: ')
vercryp = pbkdf2_sha256.verify(loginpas, hashez)
if vercryp == True:
    print 'passwords correct - Logged in!'
    
else:
    print 'passwords wrong - Could not log!'
    #exit
signin!
Username: point
Password: ········
passwords correct - Logged in!
In [66]:
type(signupd)
Out[66]:
dict
In [66]:
 
In [67]:
#savjsn.write(jsncov)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-67-a899169d95c8> in <module>()
----> 1 savjsn.write(jsncov)

ValueError: I/O operation on closed file
In [17]:
#savjsn.close()
In [19]:
dicsigni = dict()
In [21]:
signin = raw_input('signin? y/n')

if 'y' in signin:
    #uzname = raw_input('firstname: ')
    #lzname = raw_input('lastname: ')
    uzernam = raw_input('username: ')
    
    dicsigni.update({'username': uzernam})
    opsignin = open('/home/wcmckee/signinlca/usernames/' + str(uzernam) + ('/') + ('.signin.json'), 'w') 

    logtest = getpass.getpass('login password: ')
    loghash = pbkdf2_sha256.encrypt(logtest, rounds=200000, salt_size=16)
    vercryp = pbkdf2_sha256.verify(logtest, hashez)
    dicsigni.update({'password':loghash})
                    
    dicjsn = json.dumps(dicsigni)
    
    opsignin.write(dicjsn)
    opsignin.close()
    
                    
    #opsignin.write

    if pastest == True:
        print 'passwords correct'
signin? y/ny
username: wcm
login password: ········
In [24]:
ersignin = open('/home/wcmckee/signinlca/usernames/' + str(uzernam) + ('/') + ('.signin.json'), 'r') 

paswz = ersignin.read()
In [28]:
dicvert = json.loads(paswz)
In [49]:
dicloin = dicvert['password']
In [39]:
tresignin = open('/home/wcmckee/signinlca/usernames/' + str(uzernam) + ('/') + ('.signups.json'), 'r')
In [40]:
convea =  tresignin.read()
In [43]:
jsnver = json.loads(convea)
In [47]:
jpas = jsnver['password']
In [50]:
jpas
Out[50]:
u'$pbkdf2-sha256$200000$utc6h9A6B.CcsxZirHVurQ$rHAHVhTN5pwOgBTj.6un2605pM.xZ8Kbu7wYRqPcFGo'
In [51]:
dicloin
Out[51]:
u'$pbkdf2-sha256$200000$tTZmrJVyTinFmHPOWYtxjg$esTV1MEbmqXDIxc4ZHUFdThYpTZdk3.2101Ndga0mes'
In [118]:
loginz = raw_input('signin y/n ')
if 'y' in loginz:
    print('You signed in')
    #logoutz = None
else:
    logoutz = raw_input('signouts y/n ')
signin y/n y
You signed in
In [119]:
if 'y' in loginz:
    firnam = raw_input('first name: ')
    lasnam = raw_input('last name: ')
    tshir = raw_input('tshirt size: ')
    cofvol = raw_input('coffee volc: ')
    comen = raw_input('comments: ')
    betdict = dict()
    betdict.update({'first-name' : firnam, 'last-name' : lasnam, 'signin-date' : returndate()})
    betdict.update({'signin-hrmin' : returntime()})
    betdict.update({'tshirt-size' : tshir})
    betdict.update({'coffees' : int(cofvol)})
    betdict.update({'comments:' : comen})
    convj = json.dumps(betdict)
    puser = getpass.getuser()
    opday = open((dayzpath + '/' + firnam + lasnam) + '.json', 'w')
    opday.write(str(convj))
    opday.close()
else:
    print ('not signing in')
first name: w
last name: mcke
tshirt size: L
coffee volc: 4
comments: test
In [480]:
if 'y' in logoutz:
    comout = raw_input('out comments: ')
    outdic = dict()
    
    firnaz = raw_input('first name: ' )
    lasnaz = raw_input('last name: ')
    outdic.update({'signout-date': returndate()})
    outdic.update({'signout-time': returntime()})
    outdic.update({'signout-comment': comout})
    conout = json.dumps(outdic)
    signoutz = open((dayzpath + '/' + firnaz + lasnaz) + '.json', 'a')
    signoutz.write(str(conout))
    signoutz.close()
else:
    print ('not signing out')
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-480-427a1c2429f6> in <module>()
----> 1 if 'y' in logoutz:
      2     comout = raw_input('out comments: ')
      3     outdic = dict()
      4 
      5     firnaz = raw_input('first name: ' )

TypeError: argument of type 'NoneType' is not iterable
In [481]:
os.listdir(dayzpath)
Out[481]:
['williammckee.json', 'helentesting.json']
In [481]:
 
In [68]:
files = file_paths(filtered_walk('/home/wcmckee/signinlca/', depth=100, included_files=['*.json']))
In [69]:
for fie in files:
    #print fie
    print fie
/home/wcmckee/signinlca/index.json
/home/wcmckee/signinlca/signups.json
/home/wcmckee/signinlca/2015/02/15/williammckee.json
/home/wcmckee/signinlca/2015/02/15/holleymckee.json
/home/wcmckee/signinlca/2015/02/15/wmcke.json
/home/wcmckee/signinlca/2015/Feb/07/williammckee.json
/home/wcmckee/signinlca/2015/Feb/07/blahmc.json
/home/wcmckee/signinlca/2015/Feb/13/williammckee.json
/home/wcmckee/signinlca/2015/Feb/15/williammckee.json
/home/wcmckee/signinlca/2015/Feb/14/willmckee.json
/home/wcmckee/signinlca/usernames/wcm/.signups.json
/home/wcmckee/signinlca/usernames/wcm/.signin.json
/home/wcmckee/signinlca/usernames/wmen/.signups.json
/home/wcmckee/signinlca/usernames/webmck/.signups.json
/home/wcmckee/signinlca/usernames/pjohns/.signups.json
/home/wcmckee/signinlca/usernames/checkthis/.signups.json
/home/wcmckee/signinlca/usernames/jchick/.signups.json
/home/wcmckee/signinlca/usernames/poi/.signups.json
/home/wcmckee/signinlca/usernames/qwe/.signups.json
/home/wcmckee/signinlca/usernames/point/.signups.json
/home/wcmckee/signinlca/usernames/cvb/.signups.json
/home/wcmckee/signinlca/usernames/gerty/.signups.json
/home/wcmckee/signinlca/usernames/gerty/.signin.json
/home/wcmckee/signinlca/usernames/jblog/.signups.json
/home/wcmckee/signinlca/usernames/ssung/.signups.json
/home/wcmckee/signinlca/usernames/clittle/.signups.json
In [72]:
uslis = os.listdir('/home/wcmckee/signinlca/usernames/')
In [74]:
print ('User List: ')
for usl in uslis:
    print usl
User List: 
wcm
wmck
wmen
webmck
pjohns
checkthis
jchick
poi
qwe
point
cvb
gerty
jblog
ssung
clittle
In [ ]:
 

pyguessgame

In [65]:
import random
import urwid
import os
from pyfiglet import Figlet
import json
import random
import clint

Started as a simple python guessing game to be ran in terminal. Asks for a number between 0-10 and works up to 100 - asking for the next 10 each time. On win a urwid screen shows with 'You Win'. It would be nice if the whole game was inside of urwid. Figlet intergration.

In [39]:
f = Figlet()
In [40]:
opusr = os.listdir('/home/wcmckee/signinlca/usernames/')
In [40]:
 
In [41]:
opusr
Out[41]:
['tnow',
 'signinlca.py',
 'charle',
 'wcm',
 'wmck',
 'wmen',
 'webmck',
 'pjohns',
 'red',
 'wez',
 'checkthis',
 'blah',
 'jchick',
 'poiu',
 'poi',
 'qwe',
 'point',
 'cvb',
 'blag',
 'gerty',
 'jblog',
 'ssung',
 'clittle',
 'yellow']
In [42]:
ranumz = len(opusr)
In [43]:
ranin = random.randint(0, ranumz)
In [44]:
ranin
Out[44]:
15
In [45]:
opza =  opusr[ranin]
In [46]:
opza
Out[46]:
'qwe'
In [47]:
lidte = open('/home/wcmckee/signinlca/usernames/' + opza + '/.signups.json', 'r')
In [48]:
plid = lidte.read()
In [51]:
tlid = json.loads(plid)
In [64]:
firna = tlid['firstname']
In [ ]:
 
In [60]:
tlicha = tlid['lastname']
In [66]:
fullna = firna + ' ' + tlicha
In [67]:
from clint.textui import colored, puts

perf = puts(colored.yellow(fullna))

perf
qwe qwe
In [63]:
print f.renderText(opza)
print fullna
                    
  __ ___      _____ 
 / _` \ \ /\ / / _ \
| (_| |\ V  V /  __/
 \__, | \_/\_/ \___|
    |_|             

qwe qwe
In [ ]:
 
In [38]:
def exitq(key):
    if key in ('enter', 'return'):
        raise urwid.ExitMainLoop()
In [39]:
pallette = [
    ('banner', 'dark red', 'white'),
    ('streak', 'dark red', 'white'),
    ('bg', 'dark red', 'white'),]
In [40]:
numchez = 0
In [48]:
for guesz in range(10):
    farchez = (numchez)
    numchez = (numchez + 10)
    
    def GetNum():
        return random.randint(farchez,numchez)
    
    randnum = GetNum()
    
    lownumz = (numchez)
    
    innumz = ('Enter a number between ' + str(farchez) + ' and ' + str(lownumz) + ': ')
    
    guessnum = raw_input(innumz)
    
    guesintz = int(guessnum)
    
    print ('Guess was: ' + str(guesintz))
    #colomod
    print ('Correct was: ' + str(randnum))
    
    if guesintz == randnum:
        txt = urwid.Text(f.renderText( guessnum + ' ' + str(randnum) + ' You Win!'))
        numpor = urwid.Text(f.renderText(guessnum))
        map1 = urwid.AttrMap(txt, 'streak')
        mep = urwid.AttrMap(numpor, 'streak')
        fil = urwid.Filler(map1)
        fel = urwid.Filler(mep)
        map2 = urwid.AttrMap(fil, 'bg')
        loopa = urwid.AttrMap(fel, 'bg')
        looena = urwid.MainLoop(loopa, pallette, unhandled_input=exitq)
        looena.run()
        loop = urwid.MainLoop(fil, pallette, unhandled_input=exitq)
        loop.run()
        print f.renderText(guessnum + ' ' + str(randnum) + ' You Win!')
    else:
        txt = urwid.Text(f.renderText(guessnum + ' ' + str(randnum) + ' You Lose!'))
        map1 = urwid.AttrMap(txt, 'streak')
        fil = urwid.Filler(map1)
        map2 = urwid.AttrMap(fil, 'bg')
        loop = urwid.MainLoop(fil, pallette, unhandled_input=exitq)
        loop.run()
        print f.renderText(guessnum + ' ' + str(randnum) + ' You lose!')
        
Enter a number between 70 and 80: 
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-48-ff8888390e77> in <module>()
     14     guessnum = raw_input(innumz)
     15 
---> 16     guesintz = int(guessnum)
     17 
     18     print ('Guess was: ' + str(guesintz))

ValueError: invalid literal for int() with base 10: ''
In [35]:
txt = urwid.Text(f.renderText(guessnum + ' ' + str(randnum)))
#numpor = urwid.Text(f.renderText(guessnum))
map1 = urwid.AttrMap(txt, 'streak')
#mep = urwid.AttrMap(numpor, 'streak')
fil = urwid.Filler(map1)
#fel = urwid.Filler(mep)
map2 = urwid.AttrMap(fil, 'bg')
#loopa = urwid.AttrMap(fel, 'bg')
#looena = urwid.MainLoop(loopa, pallette, unhandled_input=exitq)
#looena.run()
loop = urwid.MainLoop(fil, pallette, unhandled_input=exitq)
loop.run()
7
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-35-0a8c9e47ba8d> in <module>()
     10 #looena.run()
     11 loop = urwid.MainLoop(fil, pallette, unhandled_input=exitq)
---> 12 loop.run()
     13 print f.renderText('You Win!')

/usr/local/lib/python2.7/dist-packages/urwid/main_loop.pyc in run(self)
    276         """
    277         try:
--> 278             self._run()
    279         except ExitMainLoop:
    280             pass

/usr/local/lib/python2.7/dist-packages/urwid/main_loop.pyc in _run(self)
    373                 self.screen.stop()
    374 
--> 375         self.event_loop.run()
    376         self.stop()
    377 

/usr/local/lib/python2.7/dist-packages/urwid/main_loop.pyc in run(self)
    676             while True:
    677                 try:
--> 678                     self._loop()
    679                 except select.error as e:
    680                     if e.args[0] != 4:

/usr/local/lib/python2.7/dist-packages/urwid/main_loop.pyc in _loop(self)
    713 
    714         for fd in ready:
--> 715             self._watch_files[fd]()
    716             self._did_something = True
    717 

/usr/local/lib/python2.7/dist-packages/urwid/raw_display.pyc in <lambda>()
    390         else:
    391             wrapper = lambda: self.parse_input(
--> 392                 event_loop, callback, self.get_available_raw_input())
    393         fds = self.get_input_descriptors()
    394         handles = []

/usr/local/lib/python2.7/dist-packages/urwid/raw_display.pyc in get_available_raw_input(self)
    424         implementation; you can safely ignore it if you implement your own.
    425         """
--> 426         codes = self._get_gpm_codes() + self._get_keyboard_codes()
    427 
    428         if self._partial_codes:

/usr/local/lib/python2.7/dist-packages/urwid/raw_display.pyc in _get_keyboard_codes(self)
    498         codes = []
    499         while True:
--> 500             code = self._getch_nodelay()
    501             if code < 0:
    502                 break

/usr/local/lib/python2.7/dist-packages/urwid/raw_display.pyc in _getch_nodelay(self)
    632 
    633     def _getch_nodelay(self):
--> 634         return self._getch(0)
    635 
    636 

/usr/local/lib/python2.7/dist-packages/urwid/raw_display.pyc in _getch(self, timeout)
    542                 self.gpm_event_pending = True
    543         if self._term_input_file.fileno() in ready:
--> 544             return ord(os.read(self._term_input_file.fileno(), 1))
    545         return -1
    546 

TypeError: ord() expected a character, but string of length 0 found
In [ ]:
print 'The End'

tarpipe

In [1]:
#%%bash
#tar zcvf - /home/public/cam | ssh pi@10.1.1.14 "cat > /home/pi/sativacam.tar.gz"
#sudo rm /home/public/cam/*

TarPipe

This script creates a .tar.gz of certain folders and allows for easy search of the files backed up.

Currently focus on on motion but getsdrawn would also help.

TODO

ssh and tarpipe into redditgetsdrawn when it hits midnight on server time (GMT).

In [2]:
import tarfile
import time
import os
import getpass
import paramiko
import arrow
curtime = time.strftime("%d-%b-%Y-%H", time.gmtime())
In [3]:
sshgetdrn = paramiko.SSHClient()
In [4]:
sshgetdrn.set_missing_host_key_policy(paramiko.AutoAddPolicy())
In [5]:
usrg = getpass.getuser()
In [6]:
sshgetdrn.connect('128.199.60.12', username='wcmckee')
In [7]:
sup = sshgetdrn.exec_command('uptime')
In [8]:
ouput = sshgetdrn.invoke_shell()
In [9]:
ouput.recv(1000)
Out[9]:
'Last login: Tue Feb  3 19:17:33 2015 from 203-97-200-144.dsl.clear.net.nz\r\r\n'
In [10]:
ouput.send('uptime')
Out[10]:
6
In [11]:
ouput.send('chdir /getsdrawn')
Out[11]:
16
In [12]:
opftp = sshgetdrn.open_sftp()
In [13]:
for i in opftp.listdir('/home/wcmckee/getsdrawndotcom/'):
    print i
style.css
googlebb8fa72eb382e061.html
imgs
test.txt
sitemap.xml
index.html
In [14]:
utc = arrow.utcnow()

I want datetime printed as year-month-day-hour-min.

Currently shows on getsdrawn as : Updated Tue, 03 Feb 2015 19:00:12 +0000.

Folder format:

In [ ]:
 
In [71]:
print utc
2015-02-03T19:25:05.046523+00:00
In [73]:
utc.weekday()
Out[73]:
1
In [76]:
utc.date()
Out[76]:
datetime.date(2015, 2, 3)
In [78]:
utc.datetime
Out[78]:
datetime.datetime(2015, 2, 3, 19, 25, 5, 46523, tzinfo=tzutc())
In [8]:
backdir = ('/home/' + usrg + '/backup-motion/')
In [10]:
if os.path.isdir(backdir) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(backdir)
its false
In [11]:
tar = tarfile.open(backdir + curtime + ".tar.gz", "w:gz")
tar.add("/home/wcmckee/mot/", arcname="TarName")
tar.close()
#os.rmdir('/home/shared/cam')
In [2]:
 
In [ ]:
 

GetsDrawnDotCom

GetsDrawn DotCom

This is a python script to generate the website GetsDrawn. It takes data from /r/RedditGetsDrawn and makes something awesome.

The script has envolved and been rewritten several times.

The first script for rgdsnatch was written after I got banned from posting my artwork on /r/RedditGetsDrawn. The plan was to create a new site that displayed stuff from /r/RedditGetsDrawn.

Currently it gets the most recent 25 items on redditgetsdrawn, and saves it to a folder. The script looks at the newest 25 reference photos on RedditGetsDrawn. It focuses only on jpeg/png images and ignores and links to none .jpg or .png ending files. It is needed to instead of ignoring them files - get the image or images in some cases, from the link. The photos are always submitted from imgur. Still filter out the i.imgur files, but take the links and filter them through a python imgur module returning the .jpeg or .png files.

This is moving forward from rgdsnatch.py because I am stuck on it.

TODO

Fix the links that don't link to png/jpeg and link to webaddress. Needs to get the images that are at that web address and embed them.

Display artwork submitted under the images.

Upload artwork to user. Sends them a message on redditgetsdrawn with links.

More pandas

Saves reference images to imgs/year/month/day/reference/username-reference.png

Saves art images to imgs/year/month/day/art/username-line-bw-colour.png

Creates index.html file with: Title of site and logo: GetsDrawn Last updated date and time.

Path of image file /imgs/year/month/day/username-reference.png. (This needs changed to just their username).

Save off .meta data from reddit of each photo, saving it to reference folder. username-yrmnthday.meta - contains info such as author, title, upvotes, downvotes. Currently saving .meta files to a meta folder - along side art and reference.

Folder sorting system of files. websitename/index.html-style.css-imgs/YEAR(15)-MONTH(2)-DAY(4)/art-reference-meta Inside art folder Currently it generates USERNAME-line/bw/colour.png 50/50 white files. Maybe should be getting art replies from reddit?

Inside reference folder Reference fold is working decent. it creates USERNAME-reference.png / jpeg files.

Currently saves username-line-bw-colour.png to imgs folder. Instead get it to save to imgs/year/month/day/usernames.png. Script checks the year/month/day and if folder isnt created, it creates it. If folder is there, exit. Maybe get the reference image and save it with the line/bw/color.pngs

The script now filters the jpeg and png image and skips links to imgur pages. This needs to be fixed by getting the images from the imgur pages. It renames the image files to the redditor username followed by a -reference tag (and ending with png of course). It opens these files up with PIL and checks the sizes. It needs to resize the images that are larger than 800px to 800px. These images need to be linked in the index.html instead of the imgur altenatives.

Instead of the jpeg/png files on imgur they are downloaded to the server with this script.

Filter through as images are getting downloaded and if it has been less than certain time or if the image has been submitted before

Extending the subreddits it gets data from to cycle though a list, run script though list of subreddits.

Browse certain days - Current day by default but option to scroll through other days.

Filters - male/female/animals/couples etc Function that returns only male portraits. tags to add to photos. Filter images with tags

In [2]:
import os 
import requests
from bs4 import BeautifulSoup
import re
import json
import time
import praw
import dominate
from dominate.tags import * 
from time import gmtime, strftime
#import nose
#import unittest
import numpy as np
import pandas as pd
from pandas import *
from PIL import Image
from pprint import pprint
#import pyttsx
import shutil
import getpass
In [3]:
hosnam = getpass.getuser()
In [4]:
gtsdrndir = ('/home/' + hosnam + '/getsdrawndotcom/')
In [5]:
gtsdrndir
Out[5]:
'/home/wcmckee/getsdrawndotcom/'
In [7]:
if os.path.isdir(gtsdrndir) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(gtsdrndir)
its false
In [9]:
os.chdir(gtsdrndir)
In [10]:
r = praw.Reddit(user_agent='getsdrawndotcom')
In [11]:
#getmin = r.get_redditor('itwillbemine')
In [12]:
#mincom = getmin.get_comments()
In [13]:
#engine = pyttsx.init()

#engine.say('The quick brown fox jumped over the lazy dog.')
#engine.runAndWait()
In [14]:
#shtweet = []
In [15]:
#for mi in mincom:
#    print mi
#    shtweet.append(mi)
In [16]:
bodycom = []
bodyicv = dict()
In [17]:
#beginz = pyttsx.init()
In [18]:
#for shtz in shtweet:
#    print shtz.downs
#    print shtz.ups
#    print shtz.body
#    print shtz.replies
    #beginz.say(shtz.author)
    #beginz.say(shtz.body)
    #beginz.runAndWait()
    
#    bodycom.append(shtz.body)
    #bodyic
In [19]:
#bodycom 
In [20]:
getnewr = r.get_subreddit('redditgetsdrawn')
In [21]:
rdnew = getnewr.get_new()
In [22]:
lisrgc = []
lisauth = []
In [23]:
for uz in rdnew:
    #print uz
    lisrgc.append(uz)
In [24]:
gtdrndic = dict()
In [30]:
imgdir = (gtsdrndir + 'imgs')
In [35]:
imgdir
Out[35]:
'/home/wcmckee/getsdrawndotcom/imgs'
In [37]:
if os.path.isdir(imgdir) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(imgdir)
its false
In [38]:
artlist = os.listdir(imgdir)
In [39]:
from time import time
In [40]:
yearz = strftime("%y", gmtime())
monthz = strftime("%m", gmtime())
dayz = strftime("%d", gmtime())


#strftime("%y %m %d", gmtime())
In [53]:
yrzpat = (imgdir + yearz)
monzpath = (yrzpat + '/' + monthz)
dayzpath = (monzpath + '/' + dayz)
rmgzdays = (dayzpath + '/reference')
imgzdays = (dayzpath + '/art')
metzdays = (dayzpath + '/meta')

repathz = (imgdir + '/' + yearz + '/' + monthz + '/' + dayz + '/')
In [54]:
repathz
Out[54]:
'/home/wcmckee/getsdrawndotcom/imgs/15/01/31/'
In [55]:
imgzdays
Out[55]:
'/home/wcmckee/getsdrawndotcom/imgs15/01/31/art'
In [56]:
repathz
Out[56]:
'/home/wcmckee/getsdrawndotcom/imgs/15/01/31/'
In [57]:
def ospacheck():
    if os.path.isdir(imgdir + yearz) == True:
        print 'its true'
    else:
        print 'its false'
        os.mkdir(imgdir + yearz)
    
In [58]:
ospacheck()
its false
In [47]:
#if os.path.isdir(imgzdir + yearz) == True:
#    print 'its true'
#else:
#    print 'its false'
#    os.mkdir(imgzdir + yearz)
In [48]:
lizmon = ['monzpath', 'dayzpath', 'imgzdays', 'rmgzdays', 'metzdays']

Something is wrong with the script and it's no longer creating these dir in the correct folder. How did this break? Fixed that but problems with it Getting error: OSError: [Errno 17] File exists: '/home/wcmckee/getsdrawndotcom/imgs/15/01' If the file exists it should be skipping over it, thats why it has the os.path.isdir == True: print its true else print its false, and make the dir

In [231]:
if os.path.isdir(monzpath) == True:
    print 'its true'
else:
    print 'its false'
    #os.mkdir('/home/wcmckee/getsdrawndotcom/' + monzpath)
its true
In [230]:
if os.path.isdir(dayzpath) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(dayzpath)

if os.path.isdir(imgzdays) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(imgzdays)
    
if os.path.isdir(rmgzdays) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(rmgzdays)
    
if os.path.isdir(metzdays) == True:
    print 'its true'
else:
    print 'its false'
    os.mkdir(metzdays)
its true
its true
its true
its true
In [158]:
#for liz in lizmon:
#    if os.path.isdir(liz) == True:
##        print 'its true'
 #   else:
#        print 'its false'
#        os.mkdir(liz)
In [159]:
fullhom = ('/home/wcmckee/getsdrawndotcom/')
In [160]:
#artlist
In [161]:
httpad = ('http://getsdrawn.com/imgs')
In [162]:
#im = Image.new("RGB", (512, 512), "white")
#im.save(file + ".thumbnail", "JPEG")
In [163]:
rmgzdays = (dayzpath + '/reference')
imgzdays = (dayzpath + '/art')
metzdays = (dayzpath + '/meta')
In [165]:
os.chdir(metzdays)
In [166]:
metadict = dict()

if i save the data to the file how am i going to get it to update as the post is archieved. Such as up and down votes.

In [167]:
for lisz in lisrgc:
    metadict.update({'up': lisz.ups})
    metadict.update({'down': lisz.downs})
    metadict.update({'title': lisz.title})
    metadict.update({'created': lisz.created})
    #metadict.update({'createdutc': lisz.created_utc})
    #print lisz.ups
    #print lisz.downs
    #print lisz.created
    #print lisz.comments
In [168]:
metadict
Out[168]:
{'created': 1420509834.0,
 'down': 0,
 'title': u"I'm getting married and would love a portrait of Myself and my Fiancee",
 'up': 0}

Need to save json object.

Dict is created but it isnt saving. Looping through lisrgc twice, should only require the one loop.

Cycle through lisr and append to dict/concert to json, and also cycle through lisr.author meta folders saving the json that was created.

In [170]:
for lisr in lisrgc:
    gtdrndic.update({'title': lisr.title})
    lisauth.append(str(lisr.author))
    for osliz in os.listdir(metzdays):
        with open(str(lisr.author) + '.meta', "w") as f:
            rstrin = lisr.title.encode('ascii', 'ignore').decode('ascii')
            #print matdict
            #metadict = dict()
            #for lisz in lisrgc:
            #    metadict.update({'up': lisz.ups})
            #    metadict.update({'down': lisz.downs})
            #    metadict.update({'title': lisz.title})
            #    metadict.update({'created': lisz.created})
            f.write(rstrin)
In [171]:
#matdict

I have it creating a meta folder and creating/writing username.meta files. It wrote 'test' in each folder, but now it writes the photo author title of post.. the username/image data. It should be writing more than author title - maybe upvotes/downvotes, subreddit, time published etc.

In [172]:
#os.listdir(dayzpath)

Instead of creating these white images, why not download the art replies of the reference photo.

In [173]:
#for lisa in lisauth:
#    #print lisa + '-line.png'
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-line.png')
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-bw.png')

    #print lisa + '-bw.png'
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-colour.png')

    #print lisa + '-colour.png'
In [174]:
os.listdir('/home/wcmckee/getsdrawndotcom/imgs')
Out[174]:
['getsdrawn-bw.png', '12', '15', '14']
In [175]:
#lisauth

I want to save the list of usernames that submit images as png files in a dir. Currently when I call the list of authors it returns Redditor(user_name='theusername'). I want to return 'theusername'. Once this is resolved I can add '-line.png' '-bw.png' '-colour.png' to each folder.

In [176]:
#lisr.author
In [177]:
namlis = []
In [178]:
opsinz = open('/home/wcmckee/visignsys/index.meta', 'r')
panz = opsinz.read()
In [180]:
os.chdir(rmgzdays)

Filter the non jpeg/png links. Need to perform request or imgur api to get the jpeg/png files from the link. Hey maybe bs4?

In [180]:
 
In [181]:
from imgurpython import ImgurClient
In [182]:
opps = open('/home/wcmckee/ps.txt', 'r')
opzs = open('/home/wcmckee/ps2.txt', 'r')
oprd = opps.read()
opzrd = opzs.read()
In [183]:
client = ImgurClient(oprd, opzrd)

# Example request
#items = client.gallery()
#for item in items:
#    print(item.link)
    

#itz = client.get_album_images()
In [184]:
galim = client.get_image('SBaV275')
In [185]:
galim.size
Out[185]:
1947098
In [186]:
gelim = client.get_album_images('LTDJ9')
In [187]:
gelim
Out[187]:
[<imgurpython.imgur.models.image.Image instance at 0xafa02120>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f18c8>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1238>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1d78>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1b70>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1d28>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1a80>,
 <imgurpython.imgur.models.image.Image instance at 0xaf9f1e40>]
In [188]:
from urlparse import urlparse
In [189]:
linklis = []

I need to get the image ids from each url. Strip the http://imgur.com/ from the string. The gallery id is the random characters after. if it's an album a is added. if multi imgs then , is used to seprate.

Doesnt currently work.

Having problems with mixed /a/etwet and wetfwet urls. Using .strip('/') to remove forward slash in front of path.

In [190]:
pathlis = []
In [191]:
for rdz in lisrgc:
    if 'http://imgur.com/' in rdz.url:
        print rdz.url
        parsed = urlparse(rdz.url)
        print parsed.path.strip('/')
        pathlis.append(parsed.path.strip('/'))
        #for pared in parsed.path:
        #    print pared.strip('/')
        #itgar = client.gallery_item(parsed.path.strip('/'))
        #itz = client.get_album_images(parsed.path.strip('a/'))
#        reimg = requests.get(rdz.url)
##        retxt = reimg.text
#        souptxt = BeautifulSoup(''.join(retxt))
#        soupurz = souptxt.findAll('img')
#        for soupuz in soupurz:
#            imgurl = soupuz['src']
#            print imgurl
#            linklis.append(imgurl)
            
            #try:
            #    imzdata = requests.get(imgurl)
http://imgur.com/Bmr5mit
Bmr5mit
http://imgur.com/putrdyx
putrdyx
http://imgur.com/xXI94VS
xXI94VS
http://imgur.com/sEJtVao
sEJtVao
http://imgur.com/a/h7aS9
a/h7aS9
http://imgur.com/m9agv3W
m9agv3W
http://imgur.com/a/SBSkJ
a/SBSkJ
http://imgur.com/OTRADPu
OTRADPu
http://imgur.com/Z1XdGv3
Z1XdGv3
http://imgur.com/pgqisMy
pgqisMy
http://imgur.com/9p61Jtb
9p61Jtb
http://imgur.com/a/IkZ37
a/IkZ37
http://imgur.com/Wh7hbU2
Wh7hbU2
http://imgur.com/Jr85F4I
Jr85F4I
http://imgur.com/VJOonmM
VJOonmM
In [192]:
pathlis
Out[192]:
[u'Bmr5mit',
 u'putrdyx',
 u'xXI94VS',
 u'sEJtVao',
 u'a/h7aS9',
 u'm9agv3W',
 u'a/SBSkJ',
 u'OTRADPu',
 u'Z1XdGv3',
 u'pgqisMy',
 u'9p61Jtb',
 u'a/IkZ37',
 u'Wh7hbU2',
 u'Jr85F4I',
 u'VJOonmM']
In [193]:
noalis = []
In [194]:
for pathl in pathlis:
    if 'a/' in pathl:
        print 'a found'
    else:
        noalis.append(pathl)
a found
a found
a found
In [195]:
#if 'a/' in pathlis:
#    print 'a found'
#else:
#    noalis.append(pathlis)
In [196]:
for noaz in noalis:
    print noaz
    #itgar = client.gallery_item()
Bmr5mit
putrdyx
xXI94VS
sEJtVao
m9agv3W
OTRADPu
Z1XdGv3
pgqisMy
9p61Jtb
Wh7hbU2
Jr85F4I
VJOonmM
In [197]:
linklis
Out[197]:
[]
In [198]:
if '.jpg' in linklis:
    print 'yes'
else:
    print 'no'
no
In [199]:
#panz()
for rdz in lisrgc:
    (rdz.title)
    #a(rdz.url)
    if 'http://i.imgur.com' in rdz.url:
        #print rdz.url
        print (rdz.url)
        url = rdz.url
        response = requests.get(url, stream=True)
        with open(str(rdz.author) + '-reference.png', 'wb') as out_file:
            shutil.copyfileobj(response.raw, out_file)
            del response
http://i.imgur.com/zUmgwd7.jpg
http://i.imgur.com/iLDvh9P.jpg?1
http://i.imgur.com/kI7enKC.jpg
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-199-da5355938f18> in <module>()
      9         response = requests.get(url, stream=True)
     10         with open(str(rdz.author) + '-reference.png', 'wb') as out_file:
---> 11             shutil.copyfileobj(response.raw, out_file)
     12             del response

/usr/lib/python2.7/shutil.pyc in copyfileobj(fsrc, fdst, length)
     47     """copy data from file-like object fsrc to file-like object fdst"""
     48     while 1:
---> 49         buf = fsrc.read(length)
     50         if not buf:
     51             break

/usr/lib/python2.7/dist-packages/urllib3/response.pyc in read(self, amt, decode_content, cache_content)
    172             else:
    173                 cache_content = False
--> 174                 data = self._fp.read(amt)
    175                 if amt != 0 and not data:  # Platform-specific: Buggy versions of Python.
    176                     # Close the connection when no data is returned

/usr/lib/python2.7/httplib.pyc in read(self, amt)
    565         # connection, and the user is reading more bytes than will be provided
    566         # (for example, reading in 1k chunks)
--> 567         s = self.fp.read(amt)
    568         if not s and amt:
    569             # Ideally, we would raise IncompleteRead if the content-length

/usr/lib/python2.7/socket.pyc in read(self, size)
    378                 # fragmentation issues on many platforms.
    379                 try:
--> 380                     data = self._sock.recv(left)
    381                 except error, e:
    382                     if e.args[0] == EINTR:

KeyboardInterrupt: 
In [200]:
apsize = []
In [201]:
aptype = []
In [202]:
basewidth = 600
In [203]:
imgdict = dict()
In [205]:
for rmglis in os.listdir(rmgzdays):
    #print rmglis
    im = Image.open(rmglis)
    #print im.size
    imgdict.update({rmglis : im.size})
    #im.thumbnail(size, Image.ANTIALIAS)
    #im.save(file + ".thumbnail", "JPEG")
    apsize.append(im.size)
    aptype.append(rmglis)
In [206]:
#for imdva in imgdict.values():
    #print imdva
    #for deva in imdva:
        #print deva
     #   if deva < 1000:
      #      print 'omg less than 1000'
       # else:
        #    print 'omg more than 1000'
         #   print deva / 2
            #print imgdict.values
            # Needs to update imgdict.values with this new number. Must halve height also.
In [207]:
#basewidth = 300
#img = Image.open('somepic.jpg')
#wpercent = (basewidth/float(img.size[0]))
#hsize = int((float(img.size[1])*float(wpercent)))
#img = img.resize((basewidth,hsize), PIL.Image.ANTIALIAS)
#img.save('sompic.jpg')
In [208]:
#os.chdir(metzdays)
In [208]:
 
In [209]:
#for numz in apsize:
#    print numz[0]
 #   if numz[0] > 800:
#        print ('greater than 800')
#    else:
#        print ('less than 800!')
In [210]:
reliz = []
In [212]:
for refls in os.listdir(rmgzdays):
    #print rmgzdays + refls
    reliz.append(rmgzdays + '/' + refls)
In [213]:
reliz
Out[213]:
['/home/wcmckee/getsdrawndotcom/imgs/15/01/06/reference/equinox5005-reference.png',
 '/home/wcmckee/getsdrawndotcom/imgs/15/01/06/reference/lennon3862-reference.png',
 '/home/wcmckee/getsdrawndotcom/imgs/15/01/06/reference/Bitches_BeCrazy-reference.png']
In [214]:
aptype
Out[214]:
['equinox5005-reference.png',
 'lennon3862-reference.png',
 'Bitches_BeCrazy-reference.png']
In [215]:
opad = open('/home/wcmckee/ad.html', 'r')
In [216]:
opred = opad.read()
In [217]:
str2 = opred.replace("\n", "")
In [218]:
str2
Out[218]:
'<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script><!-- header --><ins class="adsbygoogle"     style="display:inline-block;width:970px;height:250px"     data-ad-client="ca-pub-2716205862465403"     data-ad-slot="3994067148"></ins><script>(adsbygoogle = window.adsbygoogle || []).push({});</script>'
In [219]:
doc = dominate.document(title='GetsDrawn')

with doc.head:
    link(rel='stylesheet', href='style.css')
    script(type ='text/javascript', src='script.js')
    str(str2)
    
    with div():
        attr(cls='header')
        h1('GetsDrawn')
        p(img('imgs/getsdrawn-bw.png', src='imgs/getsdrawn-bw.png'))
        #p(img('imgs/15/01/02/ReptileLover82-reference.png', src= 'imgs/15/01/02/ReptileLover82-reference.png'))
        h1('Updated ', strftime("%a, %d %b %Y %H:%M:%S +0000", gmtime()))
        p(panz)
        p(bodycom)
    
    

with doc:
    with div(id='body').add(ol()):
        for rdz in reliz:
            #h1(rdz.title)
            #a(rdz.url)
            #p(img(rdz, src='%s' % rdz))
            #print rdz
            p(img(rdz, src = rdz))
            p(rdz)


                
            #print rdz.url
            #if '.jpg' in rdz.url:
            #    img(rdz.urlz)
            #else:
            #    a(rdz.urlz)
            #h1(str(rdz.author))
            
            #li(img(i.lower(), src='%s' % i))

    with div():
        attr(cls='body')
        p('GetsDrawn is open source')
        a('https://github.com/getsdrawn/getsdrawndotcom')
        a('https://reddit.com/r/redditgetsdrawn')

#print doc
In [220]:
docre = doc.render()
In [221]:
#s = docre.decode('ascii', 'ignore')
In [222]:
yourstring = docre.encode('ascii', 'ignore').decode('ascii')
In [223]:
indfil = ('/home/wcmckee/getsdrawndotcom/index.html')
In [224]:
mkind = open(indfil, 'w')
mkind.write(yourstring)
mkind.close()
In [225]:
#os.system('scp -r /home/wcmckee/getsdrawndotcom/ wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom')
In [226]:
#rsync -azP source destination
In [80]:
#updatehtm = raw_input('Update index? Y/n')
#updateref = raw_input('Update reference? Y/n')

#if 'y' or '' in updatehtm:
#    os.system('scp -r /home/wcmckee/getsdrawndotcom/index.html wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/index.html')
#elif 'n' in updatehtm:
#    print 'not uploading'
#if 'y' or '' in updateref:
#    os.system('rsync -azP /home/wcmckee/getsdrawndotcom/ wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/')
In [81]:
os.system('scp -r /home/wcmckee/getsdrawndotcom/index.html wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/index.html')
Out[81]:
0
In [553]:
#os.system('scp -r /home/wcmckee/getsdrawndotcom/style.css wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/style.css')
In [553]:
 
In [321]:
 
In [138]:
 
In [138]:
 
In [ ]:
 

lcacoffee

lcacoffee

script that displays coffees sold by hour at lca2015. Currently it opens a .json file and converts it into a python dict.

It's missing monday data.

sale by hour is: key - the hour (24hr). Value is total paid sales (people who ran out of vouchers). Number in brackets is total number of coffees sold.

Number in brackets - Number not in brackets = amount of coffees sold with vouchers.

Each sale[day].json file contains a json objects. Important keys/values are:

Product : cappuccino/flat white etc...

Tags : These are taged with coffee followed by ; and if its free ;.

count : amount of these coffees sold.

SKU : ? guess its a ID?

Sales ($) : Amount of dollars from selling coffee

Tax ($) : Amount of tax paid from selling the coffee

Revenue : This is Sales + Tax

Costs ($) : Cost to make the coffees.

Revenue ($) : Amount of revenue made from selling the coffee

Margin ($) : Percent of money made from selling coffee.

There are other keys but the value of them is empty. No point dealing with them.

Open up all the sale[day].json files and create a all salesfile. Append all together.

In [80]:
import json
import os
import pandas
import getpass
In [81]:
theuser = getpass.getuser()

Need to open all the sale[day].json files and append data.

In [117]:
jsonfold = ('/home/' + theuser + '/github/lcacoffee/')
alldata = ('salebyhour.json')
tueda = ('saletues.json')
weda = ('saleweds.json')
thura = ('salethurs.json')
fria = ('salefri.json')
salhr = (jsonfold + alldata)
salajs = ('/home/wcmckee/cofres/salesall.json')
In [83]:
lisz = [tueda, weda, thura, fria]
opaz = open('/home/wcmckee/cofres/salesall.json', 'w')

filza = ('/home/' + theuser + '/github/lcacoffee/' + lis)
In [118]:
for lis in lisz:
    opaz = open(salajs, 'a')
    print ('/home/' + theuser + '/github/lcacoffee/' + lis)
    opdayz = open(filza, 'r')
    opaz.write(str(opdayz.read()))
    opaz.close()
    opdayz.close()
/home/wcmckee/github/lcacoffee/saletues.json
/home/wcmckee/github/lcacoffee/saleweds.json
/home/wcmckee/github/lcacoffee/salethurs.json
/home/wcmckee/github/lcacoffee/salefri.json
In [85]:
opaz.close()
In [119]:
opzall = open(salajs, 'r')

The daysales appended into the allsales file are in fact lists.

Options include removing the lists and turning it all into one big json object

Appending all the data into one day.

eg: Product: "cappuccino" Count: 450 (total of week).

File needs steriozie removing the \n

In [120]:
opzall.read()
Out[120]:
'[\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n][\r\n  {\r\n    "Outlet":"LCA",\r\n    "Product":"Cappuccino",\r\n    "SKU": 10001,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Cappuccino FREE",\r\n    "SKU": 10009,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 63,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte",\r\n    "SKU": 10003,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 1,\r\n    "Sales ($)": 3.4782600000,\r\n    "Tax ($)": 0.5217400000,\r\n    "Sales incl. tax ($)": 4.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 1.0000000000,\r\n    "Revenue ($)": 2.4782600000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Chai Latte FREE",\r\n    "SKU": 10010,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 9,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Espresso FREE",\r\n    "SKU": 10032,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" ",\r\n    "Supplier Code":" ",\r\n    "Count": 14,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White",\r\n    "SKU": 10000,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 57,\r\n    "Sales ($)": 198.2608200000,\r\n    "Tax ($)": 29.7391800000,\r\n    "Sales incl. tax ($)": 228.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 57.0000000000,\r\n    "Revenue ($)": 141.2608200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Flat White FREE",\r\n    "SKU": 10011,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 249,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate",\r\n    "SKU": 10005,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 7,\r\n    "Sales ($)": 24.3478200000,\r\n    "Tax ($)": 3.6521800000,\r\n    "Sales incl. tax ($)": 28.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 7.0000000000,\r\n    "Revenue ($)": 17.3478200000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Hot Chocolate FREE",\r\n    "SKU": 10012,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 121,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black",\r\n    "SKU": 10006,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 25,\r\n    "Sales ($)": 86.9565000000,\r\n    "Tax ($)": 13.0435000000,\r\n    "Sales incl. tax ($)": 100.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 25.0000000000,\r\n    "Revenue ($)": 61.9565000000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Long Black FREE",\r\n    "SKU": 10014,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 62,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha",\r\n    "SKU": 10004,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee",\r\n    "Supplier Code":" ",\r\n    "Count": 13,\r\n    "Sales ($)": 45.2173800000,\r\n    "Tax ($)": 6.7826200000,\r\n    "Sales incl. tax ($)": 52.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 13.0000000000,\r\n    "Revenue ($)": 32.2173800000,\r\n    "Margin (%)": 71.24\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Mocha FREE",\r\n    "SKU": 10015,\r\n    "Brand":" ",\r\n    "Supplier":" ",\r\n    "Type":" ",\r\n    "Tags":" coffee; free",\r\n    "Supplier Code":" ",\r\n    "Count": 81,\r\n    "Sales ($)": 0.0000000000,\r\n    "Tax ($)": 0.0000000000,\r\n    "Sales incl. tax ($)": 0.00,\r\n    "Discount ($)": 0,\r\n    "Cost ($)": 0.0000000000,\r\n    "Revenue ($)": 0.0000000000,\r\n    "Margin (%)": 0.00\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  },\r\n  {\r\n    "Outlet":"",\r\n    "Product":"Grand Total",\r\n    "SKU":null,\r\n    "Brand":"",\r\n    "Supplier":"",\r\n    "Type":"",\r\n    "Tags":"",\r\n    "Supplier Code":"",\r\n    "Count":715.00,\r\n    "Sales ($)": 403.44,\r\n    "Tax ($)": 60.50,\r\n    "Sales incl. tax ($)": 463.94,\r\n    "Discount ($)": 0.00,\r\n    "Cost ($)": 116.00,\r\n    "Revenue ($)": 287.44,\r\n    "Margin (%)": 71.24727\r\n  }\r\n]'
In [86]:
opcvs = open(salhr, 'r')
In [87]:
opzrd = opcvs.read()
In [88]:
jdunp = json.loads(opzrd)
In [89]:
valia = []
In [90]:
#pandas.read_json(jdunp)
In [91]:
jdunp.count(int)
Out[91]:
0
In [92]:
len(jdunp)
Out[92]:
6

ok if i cycle through jdunp between 0 and 23 i get the results.

cycle through ints but as a string. must add ' '

Break down coffee sales by hour.

Ignore/delete hours with zero sales. Need to create new dict with this data.

How would it look?

In [93]:
for numtwn in range(0,24):
    print "'" + str(numtwn) + "'"
'0'
'1'
'2'
'3'
'4'
'5'
'6'
'7'
'8'
'9'
'10'
'11'
'12'
'13'
'14'
'15'
'16'
'17'
'18'
'19'
'20'
'21'
'22'
'23'
In [94]:
for jdr in jdunp:
    print jdr['0']
- -
- -
- -
- -
- -
- -
In [95]:
for numtwn in range(0,24):
        print "'" + str(numtwn) + "'"
'0'
'1'
'2'
'3'
'4'
'5'
'6'
'7'
'8'
'9'
'10'
'11'
'12'
'13'
'14'
'15'
'16'
'17'
'18'
'19'
'20'
'21'
'22'
'23'
In [96]:
for dej in jdunp:
    print dej.values()
    valia.append(dej.values())
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u' 11 Jan 2015', u'', u'Sun', u'- -', u'- -', u'- -', u'4 (1)', u'0 (9)', u'12 (30)', u'0 (3)', u'8 (9)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'8 (9)', u'- -', u'40 (31)', u'44 (123)', u' 12 Jan 2015', u'', u'Mon', u'4 (13)', u'20 (118)', u'8 (52)', u'12 (34)', u'12 (46)', u'4 (33)', u'- -', u'0 (1)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'4 (17)', u'- -', u'0 (23)', u'24 (109)', u' 13 Jan 2015', u'', u'Tue', u'8 (25)', u'32 (101)', u'12 (55)', u'16 (43)', u'20 (100)', u'12 (43)', u'- -', u'4 (3)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'4 (19)', u'- -', u'8 (24)', u'20 (132)', u' 14 Jan 2015', u'', u'Wed', u'12 (39)', u'32 (133)', u'56 (64)', u'16 (36)', u'52 (83)', u'44 (38)', u'- -', u'- -', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'8 (20)', u'- -', u'20 (25)', u'68 (115)', u' 15 Jan 2015', u'', u'Thu', u'40 (40)', u'132 (136)', u'52 (55)', u'36 (50)', u'72 (103)', u'32 (49)', u'- -', u'4 (6)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'12 (21)', u'- -', u'8 (15)', u'104 (128)', u' 16 Jan 2015', u'', u'Fri', u'28 (41)', u'112 (131)', u'40 (76)', u'40 (54)', u'92 (102)', u'28 (64)', u'- -', u'- -', u'- -', u'- -']
In [96]:
 
In [97]:
dezrand = len(valia)
In [98]:
azlis = []
In [99]:
for vals in valia:    
    print vals
    azlis.append(vals)
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u' 11 Jan 2015', u'', u'Sun', u'- -', u'- -', u'- -', u'4 (1)', u'0 (9)', u'12 (30)', u'0 (3)', u'8 (9)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'8 (9)', u'- -', u'40 (31)', u'44 (123)', u' 12 Jan 2015', u'', u'Mon', u'4 (13)', u'20 (118)', u'8 (52)', u'12 (34)', u'12 (46)', u'4 (33)', u'- -', u'0 (1)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'4 (17)', u'- -', u'0 (23)', u'24 (109)', u' 13 Jan 2015', u'', u'Tue', u'8 (25)', u'32 (101)', u'12 (55)', u'16 (43)', u'20 (100)', u'12 (43)', u'- -', u'4 (3)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'4 (19)', u'- -', u'8 (24)', u'20 (132)', u' 14 Jan 2015', u'', u'Wed', u'12 (39)', u'32 (133)', u'56 (64)', u'16 (36)', u'52 (83)', u'44 (38)', u'- -', u'- -', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'8 (20)', u'- -', u'20 (25)', u'68 (115)', u' 15 Jan 2015', u'', u'Thu', u'40 (40)', u'132 (136)', u'52 (55)', u'36 (50)', u'72 (103)', u'32 (49)', u'- -', u'4 (6)', u'- -', u'- -']
[u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'- -', u'12 (21)', u'- -', u'8 (15)', u'104 (128)', u' 16 Jan 2015', u'', u'Fri', u'28 (41)', u'112 (131)', u'40 (76)', u'40 (54)', u'92 (102)', u'28 (64)', u'- -', u'- -', u'- -', u'- -']

I need to filter the - - from the results. I really only need the values that have numbers.

Take number in brackets away from number not in brackets. The number in brackets is total amount of coffees sold. The number not in brackets is amount of volchers used. The number that I get when i take away is the coffee sold without volchers.

New dict that shows only the times that coffee were sold and the amount of coffgfges that were solf. Maybe that would works.

In [100]:
betra = []
In [101]:
for azl in azlis:
    betra.append(azl)
In [102]:
anoe = []
anez = []
In [103]:
for betr in betra:
    betr.append(anoe)
In [104]:
for deta in betr:
    #print deta
    if '- -' in deta:
        print deta
    else:
        anez.append(deta)
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
- -
In [105]:
fdic = []
In [105]:
 
In [106]:
for resut in anez:
    print resut
    fdic.append(resut)
12 (21)
8 (15)
104 (128)
 16 Jan 2015

Fri
28 (41)
112 (131)
40 (76)
40 (54)
92 (102)
28 (64)
[]

How come it is only adding the wednesday data in the results. It needs to have all the datas.

Needs to take the number in brackets away from the number not in brackets.

In [107]:
fdic
Out[107]:
[u'12 (21)',
 u'8 (15)',
 u'104 (128)',
 u' 16 Jan 2015',
 u'',
 u'Fri',
 u'28 (41)',
 u'112 (131)',
 u'40 (76)',
 u'40 (54)',
 u'92 (102)',
 u'28 (64)',
 []]
In [108]:
optue = open('/home/wcmckee/Downloads/saletues.json', 'r')
In [109]:
rdtue = optue.read()
In [110]:
tuejs = json.loads(rdtue)
In [111]:
tuejs
Out[111]:
[{u'Brand': u' ',
  u'Cost ($)': 2.0,
  u'Count': 2,
  u'Discount ($)': 0,
  u'Margin (%)': 71.24,
  u'Outlet': u'LCA',
  u'Product': u'Cappuccino',
  u'Revenue ($)': 4.95652,
  u'SKU': 10001,
  u'Sales ($)': 6.95652,
  u'Sales incl. tax ($)': 8.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee',
  u'Tax ($)': 1.04348,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 69,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Cappuccino FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10009,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 15,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Chai Latte FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10010,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 7,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Espresso FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10032,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' ',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 17.0,
  u'Count': 17,
  u'Discount ($)': 0,
  u'Margin (%)': 71.24,
  u'Outlet': u'',
  u'Product': u'Flat White',
  u'Revenue ($)': 42.13042,
  u'SKU': 10000,
  u'Sales ($)': 59.13042,
  u'Sales incl. tax ($)': 68.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee',
  u'Tax ($)': 8.86958,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 242,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Flat White FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10011,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 5.0,
  u'Count': 5,
  u'Discount ($)': 0,
  u'Margin (%)': 71.24,
  u'Outlet': u'',
  u'Product': u'Hot Chocolate',
  u'Revenue ($)': 12.3913,
  u'SKU': 10005,
  u'Sales ($)': 17.3913,
  u'Sales incl. tax ($)': 20.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee',
  u'Tax ($)': 2.6087,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 68,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Hot Chocolate FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10012,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 6.0,
  u'Count': 6,
  u'Discount ($)': 0,
  u'Margin (%)': 71.24,
  u'Outlet': u'',
  u'Product': u'Long Black',
  u'Revenue ($)': 14.86956,
  u'SKU': 10006,
  u'Sales ($)': 20.86956,
  u'Sales incl. tax ($)': 24.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee',
  u'Tax ($)': 3.13044,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 73,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Long Black FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10014,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 4,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Macchiato FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10030,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' ',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 3.0,
  u'Count': 3,
  u'Discount ($)': 0,
  u'Margin (%)': 71.24,
  u'Outlet': u'',
  u'Product': u'Mocha',
  u'Revenue ($)': 7.43478,
  u'SKU': 10004,
  u'Sales ($)': 10.43478,
  u'Sales incl. tax ($)': 12.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee',
  u'Tax ($)': 1.56522,
  u'Type': u' '},
 {u'Brand': u' ',
  u'Cost ($)': 0.0,
  u'Count': 61,
  u'Discount ($)': 0,
  u'Margin (%)': 0.0,
  u'Outlet': u'',
  u'Product': u'Mocha FREE',
  u'Revenue ($)': 0.0,
  u'SKU': 10015,
  u'Sales ($)': 0.0,
  u'Sales incl. tax ($)': 0.0,
  u'Supplier': u' ',
  u'Supplier Code': u' ',
  u'Tags': u' coffee; free',
  u'Tax ($)': 0.0,
  u'Type': u' '},
 {u'Brand': u'',
  u'Cost ($)': 33.0,
  u'Count': 572.0,
  u'Discount ($)': 0.0,
  u'Margin (%)': 71.24433,
  u'Outlet': u'',
  u'Product': u'Total',
  u'Revenue ($)': 81.76,
  u'SKU': None,
  u'Sales ($)': 114.76,
  u'Sales incl. tax ($)': 131.95,
  u'Supplier': u'',
  u'Supplier Code': u'',
  u'Tags': u'',
  u'Tax ($)': 17.19,
  u'Type': u''},
 {u'Brand': u'',
  u'Cost ($)': 33.0,
  u'Count': 572.0,
  u'Discount ($)': 0.0,
  u'Margin (%)': 71.24433,
  u'Outlet': u'',
  u'Product': u'Grand Total',
  u'Revenue ($)': 81.76,
  u'SKU': None,
  u'Sales ($)': 114.76,
  u'Sales incl. tax ($)': 131.95,
  u'Supplier': u'',
  u'Supplier Code': u'',
  u'Tags': u'',
  u'Tax ($)': 17.19,
  u'Type': u''}]
In [112]:
saltax = []
In [113]:
for bran in tuejs:
    #print bran['Revenue ($)']
    print bran['Sales incl. tax ($)']
    saltax.append(bran['Sales incl. tax ($)'])
8.0
0.0
0.0
0.0
68.0
0.0
20.0
0.0
24.0
0.0
0.0
12.0
0.0
131.95
131.95
In [114]:
satxtot = sum(saltax)
In [115]:
satxtot
Out[115]:
395.9
In [116]:
for bran in tuejs:
    #print bran['Revenue ($)']
    print bran['Product']
Cappuccino
Cappuccino FREE
Chai Latte FREE
Espresso FREE
Flat White
Flat White FREE
Hot Chocolate
Hot Chocolate FREE
Long Black
Long Black FREE
Macchiato FREE
Mocha
Mocha FREE
Total
Grand Total
In [116]:
 
In [116]:
 
In [ ]:
 

getsdrawn

GetsDrawn DotCom

This is a python script to generate the website GetsDrawn. It takes data from /r/RedditGetsDrawn and makes something awesome.

The script has envolved and been rewritten several times.

The first script for rgdsnatch was written after I got banned from posting my artwork on /r/RedditGetsDrawn. The plan was to create a new site that displayed stuff from /r/RedditGetsDrawn.

Currently it only displays the most recent 25 items on redditgetsdrawn. The script looks at the newest 25 reference photos on RedditGetsDrawn. It focuses only on jpeg/png images and ignores and links to none .jpg or .png ending files. It is needed to instead of ignoring them files - get the image or images in some cases, from the link. The photos are always submitted from imgur. Still filter out the i.imgur files, but take the links and filter them through a python imgur module returning the .jpeg or .png files.

This is moving forward from rgdsnatch.py because I am stuck on it.

TODO

Fix the links that don't link to png/jpeg and link to webaddress. Needs to get the images that are at that web address and embed them.

Display artwork submitted under the images.

Upload artwork to user. Sends them a message on redditgetsdrawn with links.

More pandas

Saves reference images to imgs/year/month/day/reference/username-reference.png

Saves art images to imgs/year/month/day/art/username-line-bw-colour.png

Creates index.html file with: Title of site and logo: GetsDrawn Last updated date and time.

Path of image file /imgs/year/month/day/username-reference.png. (This needs changed to just their username).

Save off .meta data from reddit of each photo, saving it to reference folder. username-yrmnthday.meta - contains info such as author, title, upvotes, downvotes. Currently saving .meta files to a meta folder - along side art and reference.

Folder sorting system of files. websitename/index.html-style.css-imgs/YEAR(15)-MONTH(2)-DAY(4)/art-reference-meta Inside art folder Currently it generates USERNAME-line/bw/colour.png 50/50 white files. Maybe should be getting art replies from reddit?

Inside reference folder Reference fold is working decent. it creates USERNAME-reference.png / jpeg files.

Currently saves username-line-bw-colour.png to imgs folder. Instead get it to save to imgs/year/month/day/usernames.png. Script checks the year/month/day and if folder isnt created, it creates it. If folder is there, exit. Maybe get the reference image and save it with the line/bw/color.pngs

The script now filters the jpeg and png image and skips links to imgur pages. This needs to be fixed by getting the images from the imgur pages. It renames the image files to the redditor username followed by a -reference tag (and ending with png of course). It opens these files up with PIL and checks the sizes. It needs to resize the images that are larger than 800px to 800px. These images need to be linked in the index.html instead of the imgur altenatives.

Instead of the jpeg/png files on imgur they are downloaded to the server with this script.

Filter through as images are getting downloaded and if it has been less than certain time or if the image has been submitted before

Extending the subreddits it gets data from to cycle though a list, run script though list of subreddits.

Browse certain days - Current day by default but option to scroll through other days.

Filters - male/female/animals/couples etc Function that returns only male portraits. tags to add to photos. Filter images with tags

In [2]:
import os 
import requests
from bs4 import BeautifulSoup
import re
import json
import time
import praw
import dominate
from dominate.tags import * 
from time import gmtime, strftime
#import nose
#import unittest
import numpy as np
import pandas as pd
from pandas import *
from PIL import Image
from pprint import pprint
#import pyttsx
import shutil
In [3]:
gtsdrndir = ('/home/wcmckee/getsdrawndotcom')
In [4]:
os.chdir(gtsdrndir)
In [5]:
r = praw.Reddit(user_agent='getsdrawndotcom')
In [6]:
#getmin = r.get_redditor('itwillbemine')
In [7]:
#mincom = getmin.get_comments()
In [8]:
#engine = pyttsx.init()

#engine.say('The quick brown fox jumped over the lazy dog.')
#engine.runAndWait()
In [9]:
#shtweet = []
In [10]:
#for mi in mincom:
#    print mi
#    shtweet.append(mi)
In [11]:
bodycom = []
bodyicv = dict()
In [12]:
#beginz = pyttsx.init()
In [13]:
#for shtz in shtweet:
#    print shtz.downs
#    print shtz.ups
#    print shtz.body
#    print shtz.replies
    #beginz.say(shtz.author)
    #beginz.say(shtz.body)
    #beginz.runAndWait()
    
#    bodycom.append(shtz.body)
    #bodyic
In [14]:
#bodycom 
In [15]:
getnewr = r.get_subreddit('redditgetsdrawn')
In [16]:
rdnew = getnewr.get_new()
In [17]:
lisrgc = []
lisauth = []
In [18]:
for uz in rdnew:
    #print uz
    lisrgc.append(uz)
In [19]:
gtdrndic = dict()
In [20]:
imgdir = ('/home/wcmckee/getsdrawndotcom/imgs')
In [21]:
artlist = os.listdir(imgdir)
In [22]:
from time import time
In [23]:
yearz = strftime("%y", gmtime())
monthz = strftime("%m", gmtime())
dayz = strftime("%d", gmtime())


#strftime("%y %m %d", gmtime())
In [24]:
imgzdir = ('imgs/')
yrzpat = (imgzdir + yearz)
monzpath = (yrzpat + '/' + monthz)
dayzpath = (monzpath + '/' + dayz)
rmgzdays = (dayzpath + '/reference')
imgzdays = (dayzpath + '/art')
metzdays = (dayzpath + '/meta')

repathz = ('imgs/' + yearz + '/' + monthz + '/' + dayz + '/')
In [25]:
metzdays
Out[25]:
'imgs/15/01/05/meta'
In [26]:
imgzdays
Out[26]:
'imgs/15/01/05/art'
In [27]:
repathz
Out[27]:
'imgs/15/01/05/'
In [28]:
def ospacheck():
    if os.path.isdir(imgzdir + yearz) == True:
        print 'its true'
    else:
        print 'its false'
        os.mkdir(imgzdir + yearz)
    
In [29]:
ospacheck()
its true
In [30]:
#if os.path.isdir(imgzdir + yearz) == True:
#    print 'its true'
#else:
#    print 'its false'
#    os.mkdir(imgzdir + yearz)
In [117]:
lizmon = ['monzpath', 'dayzpath', 'imgzdays', 'rmgzdays', 'metzdays']
In [120]:
for liz in lizmon:
    if os.path.isdir(liz) == True:
        print 'its true'
    else:
        print 'its false'
        os.mkdir(liz)
its false
its false
its false
its false
its false
In [36]:
fullhom = ('/home/wcmckee/getsdrawndotcom/')
In [38]:
#artlist
In [39]:
httpad = ('http://getsdrawn.com/imgs')
In [40]:
#im = Image.new("RGB", (512, 512), "white")
#im.save(file + ".thumbnail", "JPEG")
In [41]:
rmgzdays = (dayzpath + '/reference')
imgzdays = (dayzpath + '/art')
metzdays = (dayzpath + '/meta')
In [42]:
os.chdir(fullhom + metzdays)
In [47]:
metadict = dict()

if i save the data to the file how am i going to get it to update as the post is archieved. Such as up and down votes.

In [55]:
for lisz in lisrgc:
    metadict.update({'up': lisz.ups})
    metadict.update({'down': lisz.downs})
    metadict.update({'title': lisz.title})
    metadict.update({'created': lisz.created})
    #metadict.update({'createdutc': lisz.created_utc})
    #print lisz.ups
    #print lisz.downs
    #print lisz.created
    #print lisz.comments
In [56]:
metadict
Out[56]:
{'created': 1420436236.0,
 'createdutc': 1420407436.0,
 'down': 0,
 'title': u"This is my favorite pic of my little ones, would anyone be interested in transforming it into color? I'd love to see your artistic interpretation!",
 'up': 2}

Need to save json object.

Dict is created but it isnt saving. Looping through lisrgc twice, should only require the one loop.

Cycle through lisr and append to dict/concert to json, and also cycle through lisr.author meta folders saving the json that was created.

In [77]:
for lisr in lisrgc:
    gtdrndic.update({'title': lisr.title})
    lisauth.append(str(lisr.author))
    for osliz in os.listdir(fullhom + metzdays):
        with open(str(lisr.author) + '.meta', "w") as f:
            rstrin = lisr.title.encode('ascii', 'ignore').decode('ascii')
            #print matdict
            #metadict = dict()
            #for lisz in lisrgc:
            #    metadict.update({'up': lisz.ups})
            #    metadict.update({'down': lisz.downs})
            #    metadict.update({'title': lisz.title})
            #    metadict.update({'created': lisz.created})
            f.write(rstrin)
In [75]:
#matdict
Out[75]:
{'created': 1420436236.0,
 'down': 0,
 'title': u"This is my favorite pic of my little ones, would anyone be interested in transforming it into color? I'd love to see your artistic interpretation!",
 'up': 2}

I have it creating a meta folder and creating/writing username.meta files. It wrote 'test' in each folder, but now it writes the photo author title of post.. the username/image data. It should be writing more than author title - maybe upvotes/downvotes, subreddit, time published etc.

In [62]:
#os.listdir(dayzpath)

Instead of creating these white images, why not download the art replies of the reference photo.

In [63]:
#for lisa in lisauth:
#    #print lisa + '-line.png'
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-line.png')
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-bw.png')

    #print lisa + '-bw.png'
#    im = Image.new("RGB", (512, 512), "white")
#    im.save(lisa + '-colour.png')

    #print lisa + '-colour.png'
In [64]:
os.listdir('/home/wcmckee/getsdrawndotcom/imgs')
Out[64]:
['getsdrawn-bw.png', '12', '15', '14']
In [65]:
#lisauth

I want to save the list of usernames that submit images as png files in a dir. Currently when I call the list of authors it returns Redditor(user_name='theusername'). I want to return 'theusername'. Once this is resolved I can add '-line.png' '-bw.png' '-colour.png' to each folder.

In [66]:
#lisr.author
In [67]:
namlis = []
In [68]:
opsinz = open('/home/wcmckee/visignsys/index.meta', 'r')
panz = opsinz.read()
In [69]:
os.chdir('/home/wcmckee/getsdrawndotcom/' + rmgzdays)

Filter the non jpeg/png links. Need to perform request or imgur api to get the jpeg/png files from the link. Hey maybe bs4?

In [83]:
 
/usr/local/lib/python2.7/dist-packages/bs4/__init__.py:189: UserWarning: "http://m.imgur.com/uurbzet" looks like a URL. Beautiful Soup is not an HTTP client. You should probably use an HTTP client to get the document behind the URL, and feed that document to Beautiful Soup.
  '"%s" looks like a URL. Beautiful Soup is not an HTTP client. You should probably use an HTTP client to get the document behind the URL, and feed that document to Beautiful Soup.' % markup)
In [130]:
from imgurpython import ImgurClient
In [136]:
opps = open('/home/wcmckee/ps.txt', 'r')
opzs = open('/home/wcmckee/ps2.txt', 'r')
oprd = opps.read()
opzrd = opzs.read()
In [140]:
client = ImgurClient(oprd, opzrd)

# Example request
#items = client.gallery()
#for item in items:
#    print(item.link)
    

#itz = client.get_album_images()
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-140-91dfeecf21cc> in <module>()
      7 
      8 
----> 9 itz = client.get_album_images()

TypeError: get_album_images() takes exactly 2 arguments (1 given)
In [102]:
linklis = []

I need to get the image ids from each url. Strip the http://imgur.com/ from the string. The gallery id is the random characters after. if it's an album a is added. if multi imgs then , is used to seprate.

Doesnt currently work.

In [141]:
for rdz in lisrgc:
    if 'http://imgur.com' in rdz.url:
        print rdz.url
        #itz = client.get_album_images()
#        reimg = requests.get(rdz.url)
##        retxt = reimg.text
#        souptxt = BeautifulSoup(''.join(retxt))
#        soupurz = souptxt.findAll('img')
#        for soupuz in soupurz:
#            imgurl = soupuz['src']
#            print imgurl
#            linklis.append(imgurl)
            
            #try:
            #    imzdata = requests.get(imgurl)
http://imgur.com/SBaV275
http://imgur.com/pFHPdwE
http://imgur.com/qRDkoj6
http://imgur.com/a/lPqbx
http://imgur.com/xmmw9H0
http://imgur.com/ViCxsrS,lxpGIUQ#1
http://imgur.com/3RtgPGW
http://imgur.com/k2kzLZu
http://imgur.com/a/LTDJ9
http://imgur.com/KZqZncZ
http://imgur.com/a/xhfGD
In [111]:
linklis
Out[111]:
['//i.imgur.com/SBaV275.jpg',
 'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7',
 '//i.imgur.com/pFHPdwE.jpg',
 'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7',
 '//i.imgur.com/qRDkoj6.jpg',
 'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7',
 '//i.imgur.com/PzRhHTr.jpg',
 '//i.imgur.com/FrlJauY.jpg']
In [115]:
if '.jpg' in linklis:
    print 'yes'
else:
    print 'no'
no
In [70]:
#panz()
for rdz in lisrgc:
    (rdz.title)
    #a(rdz.url)
    if 'http://i.imgur.com' in rdz.url:
        #print rdz.url
        print (rdz.url)
        url = rdz.url
        response = requests.get(url, stream=True)
        with open(str(rdz.author) + '-reference.png', 'wb') as out_file:
            shutil.copyfileobj(response.raw, out_file)
            del response
http://i.imgur.com/LPJyRvI.jpg
http://i.imgur.com/IXscmtj.jpg
http://i.imgur.com/HDkkpfs.jpg
http://i.imgur.com/ENDI3AG.jpg
http://i.imgur.com/P4ZXyZu.jpg
http://i.imgur.com/fTznCbi.jpg
http://i.imgur.com/aDIiH6t.jpg
http://i.imgur.com/6NifkhZ.jpg
http://i.imgur.com/EqnCVT7.jpg
http://i.imgur.com/vJgRQ2n.jpg
http://i.imgur.com/tMvP7jP.jpg
http://i.imgur.com/EEae4eN.jpg
http://i.imgur.com/SQeDd69.jpg
In [55]:
apsize = []
In [56]:
aptype = []
In [57]:
basewidth = 600
In [58]:
imgdict = dict()
In [59]:
for rmglis in os.listdir('/home/wcmckee/getsdrawndotcom/' + rmgzdays):
    #print rmglis
    im = Image.open(rmglis)
    #print im.size
    imgdict.update({rmglis : im.size})
    #im.thumbnail(size, Image.ANTIALIAS)
    #im.save(file + ".thumbnail", "JPEG")
    apsize.append(im.size)
    aptype.append(rmglis)
In [60]:
#for imdva in imgdict.values():
    #print imdva
    #for deva in imdva:
        #print deva
     #   if deva < 1000:
      #      print 'omg less than 1000'
       # else:
        #    print 'omg more than 1000'
         #   print deva / 2
            #print imgdict.values
            # Needs to update imgdict.values with this new number. Must halve height also.
In [61]:
#basewidth = 300
#img = Image.open('somepic.jpg')
#wpercent = (basewidth/float(img.size[0]))
#hsize = int((float(img.size[1])*float(wpercent)))
#img = img.resize((basewidth,hsize), PIL.Image.ANTIALIAS)
#img.save('sompic.jpg')
In [62]:
#os.chdir(metzdays)
In [62]:
 
In [63]:
#for numz in apsize:
#    print numz[0]
 #   if numz[0] > 800:
#        print ('greater than 800')
#    else:
#        print ('less than 800!')
In [64]:
reliz = []
In [65]:
for refls in os.listdir('/home/wcmckee/getsdrawndotcom/' + rmgzdays):
    #print rmgzdays + refls
    reliz.append(rmgzdays + '/' + refls)
In [66]:
reliz
Out[66]:
['imgs/15/01/04/reference/clawz_nd_webz-reference.png',
 'imgs/15/01/04/reference/Jasperthecat77-reference.png',
 'imgs/15/01/04/reference/Xrayguy104-reference.png',
 'imgs/15/01/04/reference/trippedwire-reference.png',
 'imgs/15/01/04/reference/herooftime94-reference.png',
 'imgs/15/01/04/reference/OhDeBabies-reference.png',
 'imgs/15/01/04/reference/yumyumyoshi-reference.png',
 'imgs/15/01/04/reference/thaisun-reference.png',
 'imgs/15/01/04/reference/Resrey-reference.png',
 'imgs/15/01/04/reference/SinisterCanuck-reference.png',
 'imgs/15/01/04/reference/Marsinator-reference.png',
 'imgs/15/01/04/reference/zakkalaska-reference.png',
 'imgs/15/01/04/reference/jazzyghost-reference.png',
 'imgs/15/01/04/reference/Reptilebear-reference.png',
 'imgs/15/01/04/reference/WesternWaterTribe-reference.png',
 'imgs/15/01/04/reference/AidenXY-reference.png',
 'imgs/15/01/04/reference/mrsmomo-reference.png',
 'imgs/15/01/04/reference/SpaceFeline-reference.png',
 'imgs/15/01/04/reference/Jabald69-reference.png',
 'imgs/15/01/04/reference/harisshahzad98-reference.png',
 'imgs/15/01/04/reference/TinyB1-reference.png',
 'imgs/15/01/04/reference/crackettt-reference.png',
 'imgs/15/01/04/reference/seahorseVT-reference.png',
 'imgs/15/01/04/reference/the_master_blaster-reference.png',
 'imgs/15/01/04/reference/baconbreeder-reference.png']
In [67]:
aptype
Out[67]:
['clawz_nd_webz-reference.png',
 'Jasperthecat77-reference.png',
 'Xrayguy104-reference.png',
 'trippedwire-reference.png',
 'herooftime94-reference.png',
 'OhDeBabies-reference.png',
 'yumyumyoshi-reference.png',
 'thaisun-reference.png',
 'Resrey-reference.png',
 'SinisterCanuck-reference.png',
 'Marsinator-reference.png',
 'zakkalaska-reference.png',
 'jazzyghost-reference.png',
 'Reptilebear-reference.png',
 'WesternWaterTribe-reference.png',
 'AidenXY-reference.png',
 'mrsmomo-reference.png',
 'SpaceFeline-reference.png',
 'Jabald69-reference.png',
 'harisshahzad98-reference.png',
 'TinyB1-reference.png',
 'crackettt-reference.png',
 'seahorseVT-reference.png',
 'the_master_blaster-reference.png',
 'baconbreeder-reference.png']
In [68]:
opad = open('/home/wcmckee/ad.html', 'r')
In [69]:
opred = opad.read()
In [70]:
str2 = opred.replace("\n", "")
In [71]:
str2
Out[71]:
'<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script><!-- header --><ins class="adsbygoogle"     style="display:inline-block;width:970px;height:250px"     data-ad-client="ca-pub-2716205862465403"     data-ad-slot="3994067148"></ins><script>(adsbygoogle = window.adsbygoogle || []).push({});</script>'
In [72]:
doc = dominate.document(title='GetsDrawn')

with doc.head:
    link(rel='stylesheet', href='style.css')
    script(type ='text/javascript', src='script.js')
    str(str2)
    
    with div():
        attr(cls='header')
        h1('GetsDrawn')
        p(img('imgs/getsdrawn-bw.png', src='imgs/getsdrawn-bw.png'))
        #p(img('imgs/15/01/02/ReptileLover82-reference.png', src= 'imgs/15/01/02/ReptileLover82-reference.png'))
        h1('Updated ', strftime("%a, %d %b %Y %H:%M:%S +0000", gmtime()))
        p(panz)
        p(bodycom)
    
    

with doc:
    with div(id='body').add(ol()):
        for rdz in reliz:
            #h1(rdz.title)
            #a(rdz.url)
            #p(img(rdz, src='%s' % rdz))
            #print rdz
            p(img(rdz, src = rdz))
            p(rdz)


                
            #print rdz.url
            #if '.jpg' in rdz.url:
            #    img(rdz.urlz)
            #else:
            #    a(rdz.urlz)
            #h1(str(rdz.author))
            
            #li(img(i.lower(), src='%s' % i))

    with div():
        attr(cls='body')
        p('GetsDrawn is open source')
        a('https://github.com/getsdrawn/getsdrawndotcom')
        a('https://reddit.com/r/redditgetsdrawn')

#print doc
In [73]:
docre = doc.render()
In [74]:
#s = docre.decode('ascii', 'ignore')
In [75]:
yourstring = docre.encode('ascii', 'ignore').decode('ascii')
In [76]:
indfil = ('/home/wcmckee/getsdrawndotcom/index.html')
In [77]:
mkind = open(indfil, 'w')
mkind.write(yourstring)
mkind.close()
In [78]:
#os.system('scp -r /home/wcmckee/getsdrawndotcom/ wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom')
In [79]:
#rsync -azP source destination
In [80]:
#updatehtm = raw_input('Update index? Y/n')
#updateref = raw_input('Update reference? Y/n')

#if 'y' or '' in updatehtm:
#    os.system('scp -r /home/wcmckee/getsdrawndotcom/index.html wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/index.html')
#elif 'n' in updatehtm:
#    print 'not uploading'
#if 'y' or '' in updateref:
#    os.system('rsync -azP /home/wcmckee/getsdrawndotcom/ wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/')
In [81]:
os.system('scp -r /home/wcmckee/getsdrawndotcom/index.html wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/index.html')
Out[81]:
0
In [553]:
#os.system('scp -r /home/wcmckee/getsdrawndotcom/style.css wcmckee@getsdrawn.com:/home/wcmckee/getsdrawndotcom/style.css')
In [553]:
 
In [321]:
 
In [138]:
 
In [138]:
 
In [ ]:
 

pygitx

Python script to download repos from github. It checked localfiles and skips downloading any repos that currently exsist. Perform a git pull for each repo that exsists

In [50]:
from github import Github
In [51]:
import os
import getpass
from git import *
import git
In [52]:
p = open('/home/wcmckee/ps.txt', 'r')
In [53]:
#str(p.read())
In [54]:
pred = str(p.read())
In [55]:
g = Github('wcmckee', 'test')
In [56]:
grepo = g.search_users('wcmckee')                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
In [57]:
for gre in grepo:
    print gre.repos_url
https://api.github.com/users/wcmckee/repos
In [58]:
print gre.email
will@artcontrol.me
In [59]:
grrep = gre.get_repos()
In [60]:
replist =  list(grrep)
In [61]:
repolisx = []
repocom = []
In [62]:
for repoz in replist:
    print repoz.name
    repolisx.append(repoz.name)
    print repoz.size
    print repoz.updated_at
  #  repocom.append(repoz.get_commits)
4chan-API
239
2013-09-24 10:20:16
adypy
684
2014-06-09 04:52:12
anim_framework
260
2014-05-07 21:25:20
art-lastfm
636
2014-01-31 01:21:57
artcontrol-api
2012
2014-09-26 09:58:43
AsylumJamCrushGame
146482
2013-10-12 18:40:22
autopaintpy123
6176
2014-02-07 07:55:31
bacongamejam05
64
2013-12-18 14:12:00
battlenet
60
2013-10-16 14:06:51
bbc2wp
148
2014-09-21 18:35:09
bgj05-hungry
11316
2014-03-23 21:09:43
BirdCage
43209
2014-04-13 21:05:50
brobeur-static
2896
2014-05-05 02:02:45
brobeur-web
1798
2014-04-29 08:46:54
BroBeurFishingDream
5678
2014-02-16 19:19:16
broLove
108
2014-04-22 16:53:58
clint
848
2013-08-30 11:26:48
cobal.eportfolio
1361
2013-10-10 06:10:22
compohub
5030
2014-05-03 11:06:39
DailyProgBot
74
2013-08-30 11:26:48
DeltaBot
170
2013-08-30 11:26:48
dockerfiles
10055
2014-07-12 19:51:01
flask
9159
2013-12-18 05:05:44
flask-ghpages-example
76
2014-07-02 00:14:33
ggj14
50697
2014-01-26 02:48:08
hackage-server
5852
2013-09-30 07:50:04
hamiltoncomputerclub.org.nz
536
2014-05-13 00:12:55
haskell-learn
132
2014-02-05 10:22:41
HTML5-Tiny
10442
2014-05-22 03:16:18
html5lib-python
6911
2013-10-15 05:55:33
iamagamer
6755
2014-05-29 01:06:20
ihaskell-notebook
466
2013-10-01 22:09:52
imgur-python
220
2014-10-07 18:14:23
intercity
716
2014-05-12 12:56:13
intro_programming
8184
2013-12-06 05:32:29
ipython
296
2013-09-14 13:21:02
ipython-docker
4172
2014-06-10 00:29:33
ipython-hydra
70
2014-07-11 17:19:35
LD48-Skins
152
2013-12-19 20:47:10
marshmallow
422
2013-11-17 12:37:13
massive-wright
112
2013-10-29 06:30:14
Minecraft
17273
2014-05-06 19:19:03
miniprojects
87
2013-08-30 11:26:46
molyjam2013
27370
2013-08-30 11:26:46
niketa
160
2014-06-27 03:18:50
opencompetencies
778
2013-12-13 00:12:34
openfaux-server
209
2013-12-18 10:19:54
ouya-unity-plugin
26723
2014-09-10 19:08:05
paintingautopy
104
2014-02-04 12:01:36
pithos
880
2013-10-06 12:06:46
prodo-game
5208
2014-05-05 21:07:36
Projects
1712
2013-08-30 11:26:48
puphpet
11404
2014-07-11 03:36:52
pyatakl
1928
2014-06-29 01:24:11
pybackupz
196
2014-01-30 16:55:31
pyladies
5186
2014-04-26 22:23:20
pymel
175471
2013-11-17 11:45:49
python-guide
538
2013-08-30 11:26:45
python-pandora
120
2013-08-30 11:26:48
python-patterns
90
2013-08-30 11:26:47
pytm
64
2013-09-10 03:06:24
pyunderdog
188
2014-02-02 14:35:35
pywp-post
212
2014-06-02 07:16:53
raspi
128
2014-01-28 08:29:32
RasPiWrite
198
2014-01-20 18:31:17
RedditPress
356
2014-04-22 16:53:58
reddit_bot
114
2013-08-30 11:26:48
redtube
215
2014-05-07 20:52:47
render-useful
340
2014-11-14 08:56:35
RPIO
2471
2014-01-28 10:28:53
selfspy
235
2014-04-21 10:06:55
SketchDaily-GestureDrawing
5077
2014-05-22 19:15:06
skins
220
2014-02-15 12:56:01
SortPictures
140
2014-02-12 13:51:53
SoundWall
1958
2014-05-03 10:38:30
SuburbNightmare
64368
2014-04-17 06:02:40
te-whare-o-te-ata
464
2014-06-24 08:34:54
TPB
387
2013-10-04 11:50:53
Triger
18723
2014-02-18 16:10:27
tweepy
727
2013-08-30 11:26:47
UnityLuaIntegration
246
2013-08-30 11:26:47
vagrantboxes-heroku
2447
2014-07-11 13:02:20
vIPer
855
2014-05-11 06:17:26
wcmckee
1500
2014-12-19 10:27:44
wcmckee-notebook
29716
2014-09-02 11:51:52
wcmckee.com
584
2014-05-13 00:30:55
wcmckee.github.io
140
2014-04-27 12:28:03
wcmStringPY
1600
2014-02-02 17:24:08
wirepil
2152
2014-02-14 10:30:42
xboxdrv
64
2013-10-02 11:22:40
In [63]:
#for repz in repocom:
#    print repoz.ssh_url
In [64]:
for repoit in repolisx:
    print repoit
4chan-API
adypy
anim_framework
art-lastfm
artcontrol-api
AsylumJamCrushGame
autopaintpy123
bacongamejam05
battlenet
bbc2wp
bgj05-hungry
BirdCage
brobeur-static
brobeur-web
BroBeurFishingDream
broLove
clint
cobal.eportfolio
compohub
DailyProgBot
DeltaBot
dockerfiles
flask
flask-ghpages-example
ggj14
hackage-server
hamiltoncomputerclub.org.nz
haskell-learn
HTML5-Tiny
html5lib-python
iamagamer
ihaskell-notebook
imgur-python
intercity
intro_programming
ipython
ipython-docker
ipython-hydra
LD48-Skins
marshmallow
massive-wright
Minecraft
miniprojects
molyjam2013
niketa
opencompetencies
openfaux-server
ouya-unity-plugin
paintingautopy
pithos
prodo-game
Projects
puphpet
pyatakl
pybackupz
pyladies
pymel
python-guide
python-pandora
python-patterns
pytm
pyunderdog
pywp-post
raspi
RasPiWrite
RedditPress
reddit_bot
redtube
render-useful
RPIO
selfspy
SketchDaily-GestureDrawing
skins
SortPictures
SoundWall
SuburbNightmare
te-whare-o-te-ata
TPB
Triger
tweepy
UnityLuaIntegration
vagrantboxes-heroku
vIPer
wcmckee
wcmckee-notebook
wcmckee.com
wcmckee.github.io
wcmStringPY
wirepil
xboxdrv
In [65]:
repolisx
Out[65]:
[u'4chan-API',
 u'adypy',
 u'anim_framework',
 u'art-lastfm',
 u'artcontrol-api',
 u'AsylumJamCrushGame',
 u'autopaintpy123',
 u'bacongamejam05',
 u'battlenet',
 u'bbc2wp',
 u'bgj05-hungry',
 u'BirdCage',
 u'brobeur-static',
 u'brobeur-web',
 u'BroBeurFishingDream',
 u'broLove',
 u'clint',
 u'cobal.eportfolio',
 u'compohub',
 u'DailyProgBot',
 u'DeltaBot',
 u'dockerfiles',
 u'flask',
 u'flask-ghpages-example',
 u'ggj14',
 u'hackage-server',
 u'hamiltoncomputerclub.org.nz',
 u'haskell-learn',
 u'HTML5-Tiny',
 u'html5lib-python',
 u'iamagamer',
 u'ihaskell-notebook',
 u'imgur-python',
 u'intercity',
 u'intro_programming',
 u'ipython',
 u'ipython-docker',
 u'ipython-hydra',
 u'LD48-Skins',
 u'marshmallow',
 u'massive-wright',
 u'Minecraft',
 u'miniprojects',
 u'molyjam2013',
 u'niketa',
 u'opencompetencies',
 u'openfaux-server',
 u'ouya-unity-plugin',
 u'paintingautopy',
 u'pithos',
 u'prodo-game',
 u'Projects',
 u'puphpet',
 u'pyatakl',
 u'pybackupz',
 u'pyladies',
 u'pymel',
 u'python-guide',
 u'python-pandora',
 u'python-patterns',
 u'pytm',
 u'pyunderdog',
 u'pywp-post',
 u'raspi',
 u'RasPiWrite',
 u'RedditPress',
 u'reddit_bot',
 u'redtube',
 u'render-useful',
 u'RPIO',
 u'selfspy',
 u'SketchDaily-GestureDrawing',
 u'skins',
 u'SortPictures',
 u'SoundWall',
 u'SuburbNightmare',
 u'te-whare-o-te-ata',
 u'TPB',
 u'Triger',
 u'tweepy',
 u'UnityLuaIntegration',
 u'vagrantboxes-heroku',
 u'vIPer',
 u'wcmckee',
 u'wcmckee-notebook',
 u'wcmckee.com',
 u'wcmckee.github.io',
 u'wcmStringPY',
 u'wirepil',
 u'xboxdrv']
In [69]:
homlaz = ('/home/wcmckee/pygitx')
In [71]:
os.chdir(homlaz)
In [72]:
#opgitp = open('gitp.txt', 'r')
In [73]:
#_LOKDD
In [74]:
#rpa
In [75]:
dirlis = os.listdir(homlaz)
In [76]:
dirme = set(dirlis) - set(repolisx)
In [77]:
dirout = set(repolisx) - set(dirlis)
In [78]:
for dirz in dirout:
    print dirz
RasPiWrite
wcmckee.github.io
ggj14
flask
4chan-API
BroBeurFishingDream
art-lastfm
pywp-post
flask-ghpages-example
TPB
autopaintpy123
broLove
wcmckee-notebook
wirepil
dockerfiles
SketchDaily-GestureDrawing
SoundWall
te-whare-o-te-ata
SortPictures
cobal.eportfolio
intro_programming
python-guide
pyunderdog
skins
ouya-unity-plugin
adypy
pybackupz
selfspy
niketa
Projects
wcmckee
bgj05-hungry
BirdCage
iamagamer
ipython
redtube
imgur-python
paintingautopy
brobeur-web
molyjam2013
prodo-game
reddit_bot
DeltaBot
marshmallow
ipython-hydra
compohub
pyladies
python-patterns
hackage-server
pithos
bbc2wp
vIPer
Triger
RedditPress
tweepy
pytm
anim_framework
HTML5-Tiny
artcontrol-api
miniprojects
SuburbNightmare
raspi
UnityLuaIntegration
ipython-docker
intercity
puphpet
DailyProgBot
pymel
brobeur-static
clint
bacongamejam05
wcmckee.com
wcmStringPY
RPIO
html5lib-python
render-useful
vagrantboxes-heroku
python-pandora
LD48-Skins
hamiltoncomputerclub.org.nz
battlenet
haskell-learn
Minecraft
openfaux-server
AsylumJamCrushGame
xboxdrv
massive-wright
ihaskell-notebook
opencompetencies
pyatakl
In [79]:
dirme
Out[79]:
set()
In [80]:
merglis = set(dirlis) & set(repolisx)
In [81]:
merglis   
Out[81]:
set()
In [ ]:
for repoit in repolisx:
    print repoit
    os.system('git clone https://github.com/wcmckee/' + repoit)
    #git.Git().clone('https://github.com/wcmckee/' + repoit)
4chan-API
adypy
anim_framework
art-lastfm
artcontrol-api
AsylumJamCrushGame
autopaintpy123
bacongamejam05
battlenet
bbc2wp
bgj05-hungry
BirdCage
brobeur-static
brobeur-web
BroBeurFishingDream
broLove
clint
cobal.eportfolio
compohub
DailyProgBot
DeltaBot
dockerfiles
flask
flask-ghpages-example
ggj14
In [83]:
wcmrepo = git.repository('/home/wcmckee/wcmkee')
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-83-fe34f7ddb807> in <module>()
----> 1 wcmrepo = git.repository('/home/wcmckee/wcmkee')

TypeError: 'module' object is not callable
In [84]:
help('git')
Help on package git:

NAME
    git - # -*- coding: utf-8 -*- ex:set ts=4 sw=4 et:

FILE
    /usr/local/lib/python2.7/dist-packages/git/__init__.py

PACKAGE CONTENTS
    config
    gitbinary
    misc
    objects
    repository


In [84]:
 
In [85]:
import git
In [86]:
from git import *
repo = Repo("/Users/mtrier/Development/git-python")
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-86-497077d73790> in <module>()
      1 from git import *
----> 2 repo = Repo("/Users/mtrier/Development/git-python")

NameError: name 'Repo' is not defined
In [87]:
git.Repository.clone('https://github.com/wcmckee/BeOk')
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-87-66ad7af6c29e> in <module>()
----> 1 git.Repository.clone('https://github.com/wcmckee/BeOk')

TypeError: unbound method clone() must be called with Repository instance as first argument (got str instance instead)
In [88]:
git.repository.Repository.clone()
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-88-4495727d2e19> in <module>()
----> 1 git.repository.Repository.clone()

TypeError: unbound method clone() must be called with Repository instance as first argument (got nothing instead)
In [89]:
(git.objects)
Out[89]:
<module 'git.objects' from '/usr/local/lib/python2.7/dist-packages/git/objects.pyc'>
In [ ]:
 

wain

Visitor Sign System

This script was created as an altenative to a printed spreadsheet that you use a physical pen to write - date, name, reason, time in, time out. There is a comment book that this could be used to replace also. Currently people write comments in pen on lined refil. Comments The script is basic. It saves the time, date, name, reason, and added in comments as an html page and json object. Archieve of past logins are saved. It is split into two scripts. The first wain or whaxlu is used for the login. The second script is whaiout which is the logout script. It takes the output of wain or whaxlu and appends logout data - time, date, and comment.

Currently I use the script to sign in and out at a early learning centre that I am employed to work 2 hours a day, 5 days a week. It could be applied in any area where you want to keep a record of time, comments,

Similarity to twitter.

Avatars

Mention, taging, hashtags

There is often a line around the sign in/out book.

Generates summery

The site was live. It was running on a simple python web server but it worked. The script was exacuted by the Raspberry Pi B+ computer running Debian. Red pixels appeared on the screen, or #543243 if you want to be fussy. Luck had nothing to do with it. Testing was the key. 22 The text had rendered.

  1. Alfred Bunnings
  2. 09:00
  3. 01/Nov/2014
  4. InfoSec
  5. everything seems fine

Charlie MacDonald sat at his desk steering at the monitor. 'Alfred Bunnings, who is this guy?', he thought to himself. 'Jones, get over here and see this', Charlie yelled out to his Assistent. 'Coming Sir!', a voice sharply replied. Into Charlies office walked Jones. Tall, 6 foot 3. Dark Brown hair, curly and down to his shoulders. On this day his hair was tied up, usually he left it down but today was special. 'What's up sir', Jones asked, squinting at the monitor. 'Take a look for yourself', replied Charlie as he stood up. Jones took a seat at Charlies computer station. Charlie paced up and down his office. 'Sir, I've seen nothing like this before. What the hell is going on?', Jones asked Charlie. Standing still Charlie spoke: 'It's a god damn nightmare that is, this Alfred character has breached the networks. We need to find them.'

Alfred sat on the bus. The same seat he had every day. The bus was old, 50 years old and rust was pealing rom the rail. At least the button still works. He was on his way to visit an old friend, he hasn't seen in five years. Normally lfred drives his car but today was different. He felt like relaxing on the buss. His Dell laptop sat on his lap. The desktop enviorment was KDE, running Debian. He dudn't have much running. Two web browser windows, a terminal, and Pithos. On the left split was his IPython Notebook (running locallly), on the right split documation web page that he is following. The terminal window was behind the web page window. A TMUX session was running. The was split into three windows. Inside the sessoion was The IPython Notebook Kernal running on port 80. One the second window - Motion. This is used to capture save mpeg files from the Dell laptop camera.

The Streets were playing on Pithos. Stay Positive. Pithos is a desktop client for Pandora Radio.

SPIKE INFOSEC

The sign read. Alfred stood looking up above him. The skyscraper went on for miles. Squinting he could see the peak.

Today he was here for a job interview. Two days ago he had been sent a message that a position had came up and the company was interested in hiring him.

The automate doors opened as he approched them. Inside stood three security guards and a metal detotor. 'Name and reason/person visit', one of the guards asked him. 'Alfred Bunnings, I'm here for a job interview with Janet Pears'. The guard typed characters into the comput 'Alright, that seems fine sir', the guard replied'. 'Please make your way down the hall and take the second right on your left'. 'Thank you', Afred said as he begain walking down the hallway.

Everything was white. The ground, ceiling and floor. He passed the first turn off on the left and kept walking. He reached the second turn and turned left.

The colour scheme changed, colours filled the walls. Character and Landscapes scatted the walls. It was painted with a dark naples yellow, red, and a light blue. The backgrounds were blue and the characters and foreground landscapes were yellow and red.

Alfred admired the walls as he contined walking down the hallway. At the end of the hallway was a foylor with several desks setup with computers. A woman sat at the further one typing. The foylar was plainer than the hallway. Art murels still scatted the area, but they were less freuent. The are had a light blue background, with yellow and red character and landscape artwork. She looked up and smiled at Alfred. 'You must be Alfred', she exampled. Alfred approched her desk and held out he hand to shake it. 'I'm Janet', she said - shaking his hand. 'Take a seat Alfred'.

Alfred looked back at the sign on the building.

SPIKE INFOSEC

Twenty years he had worked there. Today he was moving on, had sold his shares of the company. He approched the doors. They opened automatitly. Three security guards stood there. All with very serious looks on their faces. Unlike the day he walked into the building for a job interview the guards were humanoild shaped robots. The guard looked and sounded like the same security guards on that first day. Under their fake human skin was just metal and wires.
'Morning fellas', Alfred said as he flashed his ID badge and walked down the corodilol. The corodor was still completely white, like the first time he entered the building. He walked down the hallway. Passing the first door on the left, and turning left at the second turn. The murals had changed. The colour scheme had stayed the same but the images had changed. Alfred had contrubited several times to the repainting of the murals. It was a annual event, each year five people are awarded with the job of recreating the murals. He wondered down the colourful hallway. During his third year working for Spike Infosec he won along with four other people - Noel Locksmart, Elizebeth Goodwill, Trist Gardens, and Ali Turbon. Individially they started by working on sketches. He reached the foylor - desks with computers scatted the foylor. This was the main room where people worked. It was an open space, and the desks had wheels on so they were able to be moved around the building freely. There were several breakoff rooms where smaller groups could go to work.

Alfred scanned the room. People filled the desks. Sitting or standing they typed. 'Today is special', a familar voice for Alfred sounds from behind a monitor. Janets face poped out, with a smile. 'Goodmorning Janet', Alfred smiled. Janet rose from her chair. 'Time for a coffee I believe', she said. Janet hadn't aged a day csince Alfred met her. She had long straiht brunette hair. Light complection skin, with frekles covering her face. She wore skinny horozonal glasses that had a light blue tint. On this day she wore a long silky dress. Her black leather jacket was spread out over the back of her computer chair. On her desk sat three wireless 22 inch monitors. One keyboard. One mouse. One drawing tablet. One tablet pen. Janet spent her days at work either coding or digital painting. She had the freedom to choose. If she didn't feel like painting that day, she could code instead. Her paintings were quick. Nothing took more than 45 mins. She was working on a logo. 'Spike InfoSec' it read. A furry spikey character held the sign. It's long sloopy tail weiving between the rocks on the ground. The software used for the digital painting was GIMP. Like all software developed and used by Spike InfoSec it is open source and free.

The dinosaur roared as Alfred held a large piece of steak. It's nose drew near and sniffed. 'Smells delicious', the dinosaur thought to itself. Alfred pushed the meat closer to the dinos nostrols. It spat green slime at Alfred and turned it's back and left in a huff. Jsnet hid in the bush nearby waiting for the dino to move out of sight. Alfred was covered in a green slime liquid. It had caused his body to go into shock and shutdown. Janet had to reach Alfred and reboot his system. She creeped closer to Alfreds still body. The dino sniffed the air. Janet stopped moving frozen. The dino continued walking. Janet rushed towards Alfred and grabbed his body. The Dino looked back and saw her moving away. He turned and bagan to give chase to Janet and Alfred. Scrabbambing up a tree Janet was safe from the dino. She had saved Alfred. The dino growed and circed the tree. It scratched the bark from the tree as it grew inpatienct. Janet reached for her backpack and revealed a cellphone. She dialed the number 6 3 2 1 4 5 1. Suddently the dino froze, it had been shutdown. Janet climbed down the tree, still holding onto Alfreds lifeless body. She stepped over the dinos body. Lifeless but the dino never was alive. It was another machine created to hunt her down. She had created them to help people, to make peoples lives easier. But they took her ideas and created evil. The Dark Shape was created by them. Janet never programmed her robots to kill people but others had taken the software and added kill commands. It had created great evil in the universe.

The planet Earth population had dropped to just a few thousand people in a matter of days. The machines released a toxin into the oxagin which caused the humans to choke to death. It was a quick death, but it spread quickly and was extremely contagious. Within an hour of the toxin being released half of the worlds population was dead. Over the next twenty four hour it wiped out 99.9% of mankind. Some people that held a special gene were immune. These people were scattedd about the world. Research was to be taken by a group of machines loyal to the humans which wanted to restore mankind. Working together with the few remaining humans they developed a device which allowed dispursion in time. The toxic grew complex and inserted itself into the dispursion of time and then spread the mutated toxic to people in the future, where they had found a cure. This caused the cycle to happen again - 99.9% of the world population was dead in under 24 hours. The remaining human had to find a way to give the cure to a large amount of humans without the machines noticing. They relized that having the machines help them in the cure was useless. They machines would spread their knowlege of the cure to enemy machines which would then develop a mutated toxic that the cure was useless against. The humans worked on traveling back with the cure and stopping the toxic waas being created in the first place. This is their story.

The ground that Janet walked on was rough. Stones cut into her barefeet. The sun blinded her view, she held her arm above her eyes. Attempting to block out the rays. A figure amurged in the distance.. Squinting she made haste towards the figure. As she drew closer she noticed the figure was a copy of herself. A clone. 'You must be Janet', Janet exampled. 'Yes, how do you know?', replied the machine. 'Because you are a clone of myself, and my name is Janet'. The machine stood still. 'Do you know why I am here', Janet asked the machine. 'Because you are afraid of the evils that you brought into this world', the machine spoke slowly. 'You are here to find a cure', the machine went on. 'But I will warn you young one, there is no cure here, only toxics', the machine collabsed to the ground covered in a green slime texture.

The number generater returned Noel Locksmart, Elizebeth Goodwill, Trist Gardens, and Ali Turbon. They were inserted into the text in order. Identitys were created. Credit Cards were the main source. Creating a fake identity and building up depit, and not paying it back. Janet scanned the card. Elizebeth Goodwill it the name on the license read. Born 06/11/2014. Spike Infosec: Help Desk 4. The photo was of a straight hair brunete in her early 20s, with frekles covering her face, petite. The room reveled a server room with walkways. Goodwill plugged the patch cable from the router. Connected on the otherside was a Raspberry Pi. A green and yellow light turned on by the ethernet port. It had a network. For development and testing Elizebeth perfered to use her own network connection. Truck skretched to a halt on the street. Three toots. Both truck doors opened and two men leaped out. They begain walking towards the propatery at the end of the street. Goodwill could hear several people moving about outside. She triggered her rewind time script. Time slipped back thirty mins. This gave her enough time to exit her home and make it to saftely before the assassians arrived again. The drone showed three people exiting the truck. Two men and a woman. As they scretched to a halt outside the house of Elizebeth Goodwill, was in fact Janet lived there. It was a setup. The house was trapped so that when they entered - they were never going to leave again. Thirty years Noel Locksmart had been a prisnor in the house. He will never forget the night. It was just a standard robbery. Locksmart had recieved word of a computer scientice by the name of Elizebeth Goodwill that had developed software that allowed the replaying of events in the past. The system was complex but started with just one script when Goodwill was just six years old. It was a login and logout script. The first script was the login script. This created a JSON object that included firstname, lastname, date (04/Nov/2014), time (hr:min), reason, and comment. The logout script added to the JSON object logout date, time, and comment. She wrote the script in the programming language Python. Her father - Charlie MacDonald had taught her from an early age the basics of programming. She enjoyed it so carried on. At a four year old she was involved in setting up a programming club for children. This idea spread to other towns, cities, and countries. Goodwill would travel with her Father and give demos and talks on their software development. Goodwill went on to co-founder the software development, infomation security, robotics design company: Spike InfoSec.

Ali Turbon wislied as he mixed the Indian spice together. Pandora radio was playing from his bluetooth speaker. It was on the family dinner table. Next to the speaker was a nonbranded laptop running Fedora. KDE was the desktop enviorment. Firefox was open with a IPython Notebook on the screen. This window was on the left side of the monitor. On the right top side of the monitor a television show played. It was George Benson played. 'Onions are ready to go, confirm? Y/n', a device in Turbon's left chefs jacket beeped. Turbon reached for his phone and hit enter. Three claws reached inside the cubord and picked up an onion each. The bench was a woodern bench. The claws dropped the onions on the bench then went back to the cubard to collect more onions. Several more trips back and forward then the claws switched into the next phase - cutting of the onion. The onions were lined up perfectly on the bench. Three rows, five collums. Fifteen onions. Chef knives snapped out of the claws of the device and started to cut up the onions. Each onion was cut in half, then quarters, and finally sliced. The onions were scopped up and droped into a pan of oil to fry. Once the onions were a lovely gold color they were tiped into a slow cooker. 'Carrots are ready to go, confirm? Y/n', the device The claws grabbed a orange carrot and dropped it on the bench. The same amount of carrots were used as onions. The knives cut the carrots by a slice lengthways in half, then quarters, and finally slicing. Scopped up by the claws they are dropped in a pan to be fried with oil. Once goldern the carrots are tipped into the slow cooker. Fifteen cups of soft chickpeas are tipped into the pan of oil and fried. They are tipped into the slow cooker. Fifteen medium sized tomatoes travel through the room, being griped by the claw. They are crushed and tipped into the slowcooker.
Five tea spoons of Indian spice mix is tipped into the slowcooker. 'Coconuts are ready to go, confirm? Y/n', beeped the device in Turbons hand. Enter was hit. The claws moved towards the cubard to retreave the coconuts. Seven medium sized ripe coconuts are dropped on the woodern bench. The knife snaps out of the claw and cuts the coconut clean in half. The claw snaps the knife away and revieles a sharp scoop. The white coconut insides are cut, scooped and tiped into the slowcooker. 'Item ready to cook. Is there anything else?', the device spoke. The voice was that of Trist Gardens. Her voice was sharp and clear. She spoke with confidence and joy. 'Fifteen garlic and salt', Turbon requested. 'Garlic are ready to go, confirm Y/n', the device beeped. Turbon slammed the enter key. The claws moved to the cubards and dragged out fifteen cloves of garlic. The knives snapped out and sliced it up finely. Garlic was scooped up and tipped into the slow cooker. 'Salt is ready to go, conf', before the device had finished beeping Tubon hit the enter key. Salt was tipped into the slow cooker. 'Item read to cook. Is there anything else?', Gardens voice asked. 'Confirm', Turbon confirmed. The claws began mixxing the slowcooker mixture together. Turbon had a late order for a chickpea curry that he was putting though last mintute. It was a Sunday night and he had a client that wanted a Chickpea Curry for lunch on Monday despritly. Normally he charges extra for weekend food orders but this night he made an exception. This client was special and he knew business would be better in the future because of it. A raspberry Pi and Audino powered his commucial resturont cooking device. His cellphone was connected to the raspberry pi computer that was connected to the Audino. The laptop was his portable devlopment machine. Turbon pressed a button a door opened. Server room with racks of servers running. small selfdriving cars moved up and down the hallway. The cars were careful to avoid Turbon as he walled down the hallway. He nodded at each car as they passed by. The cars had no wheels and moved along the ground by twisting. They moved along with the awkward twist movement. Rubberlike commpletion. Majority of the cars were a yellow and red colour scheme. Several had grayscale colour scheme and traveled in groups of three to five. The grayscale moved slower than the colour scheme and was quick to move out the way when one was heading towards them. Turbons reached a door. It was a large door, double his height. He banged at the door three times. The device in his pocket beeped, 'door status: open'. Turbon turned the handle and opened the door. He stepped into the room. Mid gray covered the room. A dark shape sat in the corner, reading a newspaper. Turbon paused as his eyes noticed the black shape. he was pulled closer and began walking towards the shape. He stood in front of the shape. 'Chickpea Curry is ready to go, confirm Y/n', Turbons spoke. A page of the newspaper was flicked over. Turbon waited, frozen. 'Confirm', a sound came from the black shape. The voice was that of Trist Gardens. She still continued to read the newspaper. Turbons dropped sixteen bags of curries. In each bag were 5 trays of curry. Eighty trays of curry in total. Every tray of curry also came with a tray of rice. Turbon dropped another sixteen bags. In each of these bags were 5 trays of rice. 'Enjoy your curry', Turbons said. 'Thank you', the voice of Gardens replied. The dark shape placed the newspaper beside them and stood up. They took the bags of curries and slowly walked towards the door with a large green exit sign at the end of the far wall. Turbons stood frozen untill he heard the exit door close behind the figure. He let out a sigh of relief. 'That went better than expected', Alfred spoke. 'Certainly sir, that was much calmer than last time', the voice of Gardens replied. Turbons had to be careful of these anonous bots that were being created. Last time he did an order of curries the bot attacked him and this lab. He had yet to figure out who sent the attack bot but had captured it for tests. He had increased secutiry in order to protect himself and business. Anon Bots that enter the building are now scanned with anti-malware software to test if it had code to attack. Turbons created a white list of bots that he allows to enter. Some of these do have attack code on their system but Turbons trusted the owner that they would not attack. 'Why would people want to attack a kitchen?', was the thought going through his brain. The Police arrived quickly after recieving the emergency signal. Detective Henry Pharrs was in charge of the case. Pharrs was in his late 50s, he had been in the force for 25 year. For 8 years before joining the force he worked as a computer analizt for Spike InfoSec. Today he was investagating a robots that opened fire in a room of humans. Robots created by humans were unable to kill other humans but the robots built robots that had the kill command in the system. Pharrs had studied the kill comand in robots his whole life. His father - Johns Pharrs had studied the kill command much of his life. His Son continued to reasearch the kill command. The goal was to find a cure.

The door slammed behind the dark shaped as it entered the building. Three security guards stood in front. He moved forward ignoring the guards. They said nothing and didn't move. The alarm went off as the dark shape passed through the dectoror. A guard hit a switch and the alarm stopped. The dark shape moved down the hallway reaching the second door on the left. It turned left. As it passed, the artwork on the walls turned from a bright primary colour scheme to a grayscale, black and white scheme. The Dark Shape left a bitter taste in the air. It reached the foylor. Desks weree empty. It began to wislle as it walked around the desks. The Dark Spaes head darted back and forward between desks, looking for something. It's head stopped moving - staring at an item on a desk it moved closer. An item sparkled from the desk. The black shape reached out and picked up the item and placed it inside their pocket. It turned around and walked out of the building. Shortly after The Dark Shape left the building, it explodes.

Fire engine roars down the road towards the building on fire. The Dark Shape walks down the country side as the engine passes him. Corn fields populate the farm land here. Flying drones fly above the corn - monitoring. The Dark Shape smiles as thir bare feet crunch on the grass. It feels better than the cloth feeing the building had. The Dark Shape turned and watch dark clouds of smoke emurging into the sky.

The engine arrived at the building. Much of the building was englufed in flame but the fighters would do their best to save it. Humanolid Robot jumped out of the fire engine and entered the building to check for human that may be still in the building. Two of the bots forced the front door open and entered the building. Three security guards stood there englufed in flames. They just beeped gibbish as the firefighters soaked them in anti flame powdered. The flames died down. The fighters contained down the hallway, spraying the ani flame powder as they moved. One fighter opened the first door on the left. He entered a large foylor area. Desktop with computers on were englufed in flames. The fighter began to spray the powder at the computer. He noticed a humon figure in the room. It was sitting in the far corner sitting at the desk typing on a keyboard. 'Must be a bot, can't be human', the fighter thought to himself. He moved closed to the humanold typing. The fact it didn't stopped typing is strange. Usually an emergeny mode activates when the bots are in danger. The activates a call to authorities and begins a self-repair task. The fighter sprayed the bot with anti flame powder. It continued to type.

Charlie MacDonald. Rest in Peace. The grave stone read. Born 24 October 2014. Died 12 December 2093. Trist Gardens stood over the grave. Tears flowing from her eyes. It had been two months since MacDonald passed away. She had worked with him for fifteen years at Spike InfoSec. Together they worked on a automatic cooking device. It was used in billions of home and businesses around the world. Spike InfoSec realeased blueprints of the device so anyone with a 3d printer can just print the device.

Drone hovered by MacDonalds head. The claw was attached to a woodern board. MacDonald placed a onion on the board. 'Cut', he spoke clearly. 'Confirm', the device beeped back. A knive snapped out of the claw and sliced the onion in half. A smile rose on MacDonalds face. 'Jones, get over here and see this'. The door opened and Jones walked towards MacDonald. 'What is it sir?', Jones asked. 'Watch this Jones', MacDonald said as he placed a onion on the woodern board. 'Cut', he spoke once again. 'Confirm', the device deeped back. The knife snapped out. The onion was sliced clean in half. 'wonderful work sir', Jones smiled. 'Kill Jones', MacDonald spoke, his eyes fixed on Jones. The knife snapped out of the claw and started traveling towards Jones. Jones terrified started moving backwards towards the door. He turned and began running towards the door. The knife traveled faster through the air.Janet Pears A cracking sound shoock the room as the knife impayed the brain of Jones. His head was attached to the door. Part of the knife was impayed into the door. It was a instant death. MacDonald didn't have a choice. He was on orders from Trist Gardens. If he didn't follow through and kill his assistant his whole families lives were in danger. 'Clean this up', MacDonald spoke as he walked out of his office. He stood in the foylor. Light blue covered the area. MacDonald could see Janet in distance. She was interviewing a new comer. Alfred Bunnings his name was. MacDonald watched as Janet and Alfred shook hands and took a seat. 'He looks like a good test subject', MacDonad spoke. 'Test subject: Alfred Bunnings', the device beeped back. 'Confirm', MacDonald confirmed. MacDonald continued to watch Janet and Alfred as they talked. 'So tell me about yourself', Janet had the first question. MacDonald had known Janet since she was a toddler. He would push her on the swings in the local park. MacDonalds daughter Elizebeth Goodwill would also be swinging. The Black Cat sat and watched the swings, following the figures in the seats as they moved backwards and forward. 'Black', a voice behind called and The Black Cat twisted around. The Black Cat moved towards the voice. A dark black shape stood in the distance pouring fresh meat into the cats contaiiner. 'Thanks ma'am', The Black Cat spoke, he began eating the meat. The Black Cat was a prototype for the humonald bot. It was used for testing. Under the fake fur and skin was metal and wires. A raspberry Pi computer inside the model with a cell phone and audino connected to that. Human like personioltys were added to the computer. The speech The park ranger - Alfred Bunnings had created this one. He had printed the parts and soultered the robot together. Installing the software was easy - it was all in the software store. Bunnings also worked on scripts for Black. Today he was testing a script that made the robot smell and taste food. As people were pushing their children on the swing he ordered for a humonald robot to activate in the park and send off a signal to attract Black back. This script could be used in many areas but Bunnings was developing it so he could find lost or stolen robots. Many times people come into the Park and walk away with a robot. Sometimes I don't relize it. They thing they have made a new friend but really it's a computer belonging to the park. The Park gets an alert that a bot has entered an unauthorised area and sends back gps data. The Park gives the person a friendly phone call or email and offers to either pick the robot up or let the person return the robot themselves. As long as the robot is returned the Park is not worried. Once the robot is returned it's rebooted in order to clear memories of leaving the park. The Park does not want robots remembering what is outside The Park as it causes more robots wanting to leave the park. Rebooting them isn't a problem but there is better work that the staff could be working on - such as new features. Black purred as he ate the fresh meat from his bowl. The little girl on the swing with straight brunette hair hoped off the swing and ran toward Black. He purred louder as the girl - Elizebeth Goodwill patted Black. The adult continued to push the other child - a girl with blonde hair and a darker completion than Goodwill. The Farm had rows of vegetables growing. On the pathway around the veges walked humonald robots working on the garden. Bunnings walked past the onion gardens. He smiled at the group of humanold robots in suits discussing the Onion garden. 'We need to speed haviousting up by 150% in the next three weeks', Bunnings overheard them as he walked past. The Onion Gardens were thriving. A cold and wet June month meant rain every day. The claw pulled the onion and dropped it into the crate with the rest of the onions that had been havested. The Claw moved fast, pulling another onion out of the ground in less than two seconds. Bunnings whisiled a tune as he walked past the Onion Gardens. Harvesting was excellent with the onions this year. They were experiencing a demand in onions and had increased the amount of The Claw that worked on the gardens. Bunnings entereded the carrot gardens. These parsely like vegetables added sweetness to the companies curries. Bunnings watched a The Claw plucked the carrots from the ground and dropped them in large containers. The next step for the carrots was washing, then it entered the kitchen. Bunnings entereed the tomatoe gardens. Rich red tomatoes grew on the stalk. In this garden the company used humonald robots as a majority over the claw. The humonalds were given custom arms that allows for careful picking of the tomatoes. The tomatoes were dropped into containers where they were washed and moved into the kitchen. The herb gardens were next. Here coroondir and other herbs were grown for the Indian Spice Mix.

The Resturent was where customers could come and eat, or take away curry. Open 24hour/7days. They also offered a delivery service. Humanold robots greeted the customers and showed them to their seat. If they were takeaway they offered a room to sit and read magazines/books. A room that is filled with computers - these are running 24 hours and offer swipe card assess. Security cameras and drones operate in the building to keep an eye on everything. Secuirity drones analze the cameras for any unusual activity. Tonight they had found a human bot spiking the food with toxin. Camera 532 had spoted the crime. A Claw drone spoted the activity on the monitor. Alarms had already activated. Secutiry drones and bots headed towards the area. The human bot that was spiking the food activated a self return to base mode. It caused the bot to turn invisible and extremely tiny - the size of a mouse. The bot was still able to travel at fast speeds but it was also small like a mouse, and invisible. Devices had been developed that allowed for tracking of invisible units. The Claw Drones had invisible cloaking devices build in. This allowed them to become invisible for up to 48 hours. They can enter a sleep mode that allows them to run for months without needed to land. They are able to charge themselves remotly by downloading charge off a connection. This allowed more advance space travel by allowing humans to send robots which could self repair and charge. This allowed robots leaving earth to a far away planet that took thousands of years to reach. The robots finally reached the planet and sent back message to earth on the news. It took the robots 8,500 years to reach the planet. The news took 2,500 years to arrive. The message reached Earth on December 18th 3543. The humans that sent the bots into space and to the planet had long since extint. Some of the older robots on Earth rememberd the mission and were able to decode the news file. It gave the robots great exciment that these robots had reached the planet and news had arrived back. The Robots of Earth created a time warp portal and linked to the robots on the planet. They ere able to travel between planets instantly. For the robots it felt like they had just woken up from a sleep. They work with the robots in order to get a toxic sent back in time to wipe out a certain group of humans that later develop a toxin that wipes out the human race. The toxin that is sent back is a success but spreads to forign targets - only activating again in 6 generations. When the toxin activates the nervious system in the throat stop working, causing the person to choke to death. A similar kill program was writen for the robots. It caused the robots harddrive to corrupt spreading the corruption to other systems. "I'm feeling better now", The Claw said to the humanold robot. The Claw was loading onions from the garden to the containers. These onions had been enjected with a toxin that would later cause much of the human race to die. The toxins had been placed in the onions by robots in the future with the help from a few remainings humans. Their plan was to wipeout a certain amount of humans, presuving human life in the future. Robots more in the future realized that this was a mistake and sent robots back in order to revirse the toxin. The only way for the toxin to not be created was to make sure the Spike InfoSec never waas founded. Humans were sent back in order to stop the foundation. Janet Goodwill was the number one target. The humans killed Goodwill many times but each time it was a clone. If the human form of Janet Goodwill is killed then the humans can insert the toxin into her system - sending it back generations. This simple exercise causes the company Spike InfoSec to never be founded. Alfred stood outside an empty building. He looked up. He remembers there was a Spike InfoSec sign hanging here. Instead it's a old video game store mixed with a coke logo, both fading and pealing off. The glass sliding doors didnt excist. A woodern door stood in front of Alfred. He turned the handle. It opened to a small foylor that looks like it was used for a small supermarket. Old broken trolies were scatted around the room. Cobwebs covered the room. A rat scatted accross the room. Followed by a black cat. Alfred walked down the hallway - it was filled with posters from ancient movies. They were faded and pealing off the room. Behind was a light blue textured wallpaper. the ground was a grey-blue carpet. It despritly needed a clean as dirt and grime covered it.

The humanolid robot monitored the children. Four children to every one humanold robot. The robots only monitored and did not interact with them. For every eight children one human teacher. Sixteen children, one support worker. It was all in the math. 113

How shell this day end?

on top of tht shr hda to decife to fire him for the leaking of private documents. her private key was broken

He had plans to create his own Infomation Security company.

TODO

html page of page 100 logins/logouts - append all json as one html different ways of viewing the data - current loged in, day view,

graph of weekly/day hours and how they change.

work out time between login and logout.

question that you comment on that includes the sketch daily reddit gets drawn.

fake name encription for kids/everyone? Take a name and turn into into a new name, run a script again to turn it back

records time when child eats/drinks. Keeps track of childrens hunger levels. Alerts if kids havnt aten after certain time. Tag what kids eat - data over time.

Sig for parents when signing in/out kids

txt/email alerts to parents on changes to kids signin/out

If someone doesnt signin/signout before certain time - alert

import pandas. Need to free up space on raspberry pi

Filter - new/popular comments, people, signins, signouts,

Date 01 November 2014

0000

Daily Rosted/Working chart. Spread sheet - names colum, time rows.

example: | Dino1 | Dino2 | Dino3 |

0700 | signin + comment!

0701 | signout + comment!

so on..

Live updates of the floor. Inside, Outside, Inside Float, Outside Float, General Float. Tagging. Tag people for breaks. eg - you are busy with a project, but need a break/lunch. Someone tags you and takes over at area. You go for break/lunch. When you return, tag back and carry on. The person that was just taged back moves onto the next person to tag for break/lunch. This person cycles through a group of people.

Live updates of child location. gps info of each area of centre and count how many children are there.

Panic Button. Signal to call for assistence, part-tag, allow for image/sound record.

In [1]:
import os
import time
import dominate
import sys
from dominate.tags import *
import json
import pandas as pd

Profile. Profile has data in it that is then used in sign system

In [2]:
valname = ('lasnam', 'signin', 'usercom', 'dayr', 'htmn')
In [2]:
 
In [3]:
for itzval in valname:
    print itzval
lasnam
signin
usercom
dayr
htmn

This function creates a dict, another updates it. Function is working to create it but update is coming back with error.

How do I refer to the created dict as the one to update?

In [4]:
class DictNows():

    def dictcreate(keyval, firnam):
        return dict({keyval: firnam})

    def updatedict(keyvalz, othnam):
        return dictcreate.update({keyvalz: othnam})
In [5]:
#checkdict = dictcreate('check', 'this')
In [6]:
#checkdict
In [7]:
def dictcreate(keyval, firnam):
    return dict({keyval: firnam})

#def updatedict(keyvalz, othnam):
#    return checkdict.update({keyvalz: othnam})

def returndate():
    return time.strftime(("%d" + "-" + "%b" + "-" + "%Y"))

def returntime():
    return time.strftime("%H:%M:%S")

def returan():
    return os.urandom(16)

#def blahblah():
    #open('/home/wcmckee/visignsys/posts/' + ixtwe + '.html', 'w')
    #savpos.write(str(doc))
    #savpos.close()
    
    
In [7]:
 
In [8]:
#updatedict('omg', 'not again')
In [9]:
returan()
Out[9]:
'\xb4\xd6\x9al\'\t\xfe\xb5|G["\x9d\x12\xa4~'
In [10]:
returntime()
Out[10]:
'19:37:49'
In [11]:
#DictNows.dictcreate('check')
In [12]:
dictcreate('name', 'wcm')
#updatedict()
Out[12]:
{'name': 'wcm'}
In [13]:
#updatedict('checking', 'this works')
In [14]:
newprof = raw_input('New Profile y/n: ')

if 'y' in newprof:
    lasnam = raw_input('Last Name: ')
    firnam = raw_input('First Name: ')
    dopz = raw_input('dob: ')
    mname = ('William Mckee')
    ename = raw_input('Email: ')
    signin = raw_input('Reason: ')
    usecom = raw_input('Comments: ')
    
elif 'n' in newprof:
    lasnam = ("mckee")
    firnam = ('First Name: ')
    dopz = ('dob: ')
    mname = ('William Mckee')
    ename = ('Email: ')
    signin = ('Reason: ')
    usecom = ('Comments: ')
New Profile y/n: n
In [15]:
#bitdict = 
In [16]:
betdict = dict()
In [17]:
#betdict.update({'lastname': lasnam})
In [18]:
dayr = time.strftime("%d" + "-" + "%b" + "-" + "%Y")
hrmn = time.strftime("%H:%M:%S")

# <codecell>

betdict.update({'last-name': lasnam})
betdict.update({'reason': signin})
betdict.update({'signin-comment': usecom})
betdict.update({'signin-date': returndate()})
betdict.update({'signin-hrmin': returntime()})
In [19]:
betdict
Out[19]:
{'last-name': 'mckee',
 'reason': 'Reason: ',
 'signin-comment': 'Comments: ',
 'signin-date': '19-Dec-2014',
 'signin-hrmin': '19:38:08'}
In [20]:
betjsn = json.dumps(betdict)
betjsn
Out[20]:
'{"signin-date": "19-Dec-2014", "reason": "Reason: ", "signin-comment": "Comments: ", "last-name": "mckee", "signin-hrmin": "19:38:08"}'
In [21]:
#for itz in updatedict():
#    print itz
In [22]:
opind = open('/home/wcmckee/visignsys/index.json', 'r')
opred = opind.read()
In [23]:
opred
Out[23]:
'{"signin hrmin": "15:09:51", "reason": "ESW", "firstname": "William", "signin comment": "sunny and mum is back", "lastname": "Mckee", "signin date": "02-Dec-2014"}{"signin-date": "09-Dec-2014", "reason": "esw", "signin-comment": "checking this on the bus", "last-name": "mckee", "signin-hrmin": "08:30:54"}{"signin-date": "10-Dec-2014", "reason": "eefwe", "signin-comment": "erw", "last-name": "mckee", "signin-hrmin": "01:16:44"}{"signin-date": "10-Dec-2014", "reason": "esw", "signin-comment": "blahblah", "last-name": "mckee", "signin-hrmin": "01:17:36"}{"signin-date": "17-Dec-2014", "reason": "Reason: ", "signin-comment": "Comments: ", "last-name": "mckee", "signin-hrmin": "23:02:46"}{"signin-date": "18-Dec-2014", "reason": "Reason: ", "signin-comment": "Comments: ", "last-name": "mckee", "signin-hrmin": "04:05:42"}'
In [24]:
opjsnd = json.dumps(opred)
In [25]:
str(opjsnd)
Out[25]:
'"{\\"signin hrmin\\": \\"15:09:51\\", \\"reason\\": \\"ESW\\", \\"firstname\\": \\"William\\", \\"signin comment\\": \\"sunny and mum is back\\", \\"lastname\\": \\"Mckee\\", \\"signin date\\": \\"02-Dec-2014\\"}{\\"signin-date\\": \\"09-Dec-2014\\", \\"reason\\": \\"esw\\", \\"signin-comment\\": \\"checking this on the bus\\", \\"last-name\\": \\"mckee\\", \\"signin-hrmin\\": \\"08:30:54\\"}{\\"signin-date\\": \\"10-Dec-2014\\", \\"reason\\": \\"eefwe\\", \\"signin-comment\\": \\"erw\\", \\"last-name\\": \\"mckee\\", \\"signin-hrmin\\": \\"01:16:44\\"}{\\"signin-date\\": \\"10-Dec-2014\\", \\"reason\\": \\"esw\\", \\"signin-comment\\": \\"blahblah\\", \\"last-name\\": \\"mckee\\", \\"signin-hrmin\\": \\"01:17:36\\"}{\\"signin-date\\": \\"17-Dec-2014\\", \\"reason\\": \\"Reason: \\", \\"signin-comment\\": \\"Comments: \\", \\"last-name\\": \\"mckee\\", \\"signin-hrmin\\": \\"23:02:46\\"}{\\"signin-date\\": \\"18-Dec-2014\\", \\"reason\\": \\"Reason: \\", \\"signin-comment\\": \\"Comments: \\", \\"last-name\\": \\"mckee\\", \\"signin-hrmin\\": \\"04:05:42\\"}"'
In [25]:
 
In [26]:
#json.load(opred)
In [26]:
 
In [27]:
# -*- coding: utf-8 -*-
# <nbformat>3.0</nbformat>

# <markdowncell>

# <h2>visitor sign system</h2>
# 
# This is a python script used to sign in and signout, keeping track of hours and creating a more automative system.
# 
# Make sign in and out faster, easier to keep track of.
# 
# Never forget. 
# 
# Auto roll check. 
# 
# Two random hex codes for security and correct checking. Made use of tthese by using one as file name when saving.
# 
# Creates xls file with data, also uses sqlalchemy for databases, web server, html page: 
# input (or auto) name, reason, auto day/month/year hr/min - of signin.
# 
# when launched asked if you want to signin or signout. 
# 
# how i want this to run for william:
# 
# william arrives into whai. On his phone he runs the signin script. On signing out for the day the script is run onto final part, signout. asks for comment first then records time, and date. 
# 
# comment system. leave comment for staff, parent, tag staff, area, story, parent, child.
# 
# signout - enter code of session you want to signout. 
# 
# Screw the excel file, im just dealing with index page. i am saving achieve in posts folder under urandom 13 character code. 

# <codecell>

doc = dominate.document(title='Visitor Sign Sheet')

with doc.head:
    link(rel='stylesheet', href='style.css')
    script(type='text/javascript', src='script.js')

with doc:
    with div(id='header').add(ol()):
        h1('Visitor Sign Sheet')
        for i in betdict.values():
            li(a(i))

    with div():
        attr(cls='body')
        p(opred)
        p('last updated: ' + time.strftime("%H:%M"))
        p('Visitor Sign Sheet is open source')
        a('http://github.com/wcmckee/wcmckee', href='https://github.com/wcmckee/wcmckee')

#print doc

# <codecell>

savindex = open('/home/wcmckee/visignsys/index.html', 'w')

# <codecell>

savindex.write(str(doc))
savindex.close()

# <codecell>
ixran = os.urandom(16)
ixtwe = ixran[0:16]

# <codecell>

savpos = open('/home/wcmckee/visignsys/posts/' + ixtwe + '.html', 'w')
savpos.write(str(doc))
savpos.close()

# <codecell>

savpos = open('/home/wcmckee/visignsys/posts/' + ixtwe + '.json', 'w')
savpos.write(str(betjsn))
savpos.close()

# <codecell>

#savpos = open('/home/wcmckee/visignsys/index.meta', 'w')
#savpos.write(str(wsdict.keys()))
#savpos.close()

# <codecell>

savpos = open('/home/wcmckee/visignsys/index.json', 'a')
savpos.write(str(betjsn))
savpos.close()

print ('sign in complete')
sign in complete
In [28]:
reser = pd.Series(opred)
In [29]:
#pd.DataFrame(reser)
In [30]:
rezda = []
In [31]:
for res in reser:
    print res
    rezda.append(res)
{
"
s
i
g
n
i
n
 
h
r
m
i
n
"
:
 
"
1
5
:
0
9
:
5
1
"
,
 
"
r
e
a
s
o
n
"
:
 
"
E
S
W
"
,
 
"
f
i
r
s
t
n
a
m
e
"
:
 
"
W
i
l
l
i
a
m
"
,
 
"
s
i
g
n
i
n
 
c
o
m
m
e
n
t
"
:
 
"
s
u
n
n
y
 
a
n
d
 
m
u
m
 
i
s
 
b
a
c
k
"
,
 
"
l
a
s
t
n
a
m
e
"
:
 
"
M
c
k
e
e
"
,
 
"
s
i
g
n
i
n
 
d
a
t
e
"
:
 
"
0
2
-
D
e
c
-
2
0
1
4
"
}
{
"
s
i
g
n
i
n
-
d
a
t
e
"
:
 
"
0
9
-
D
e
c
-
2
0
1
4
"
,
 
"
r
e
a
s
o
n
"
:
 
"
e
s
w
"
,
 
"
s
i
g
n
i
n
-
c
o
m
m
e
n
t
"
:
 
"
c
h
e
c
k
i
n
g
 
t
h
i
s
 
o
n
 
t
h
e
 
b
u
s
"
,
 
"
l
a
s
t
-
n
a
m
e
"
:
 
"
m
c
k
e
e
"
,
 
"
s
i
g
n
i
n
-
h
r
m
i
n
"
:
 
"
0
8
:
3
0
:
5
4
"
}
{
"
s
i
g
n
i
n
-
d
a
t
e
"
:
 
"
1
0
-
D
e
c
-
2
0
1
4
"
,
 
"
r
e
a
s
o
n
"
:
 
"
e
e
f
w
e
"
,
 
"
s
i
g
n
i
n
-
c
o
m
m
e
n
t
"
:
 
"
e
r
w
"
,
 
"
l
a
s
t
-
n
a
m
e
"
:
 
"
m
c
k
e
e
"
,
 
"
s
i
g
n
i
n
-
h
r
m
i
n
"
:
 
"
0
1
:
1
6
:
4
4
"
}
{
"
s
i
g
n
i
n
-
d
a
t
e
"
:
 
"
1
0
-
D
e
c
-
2
0
1
4
"
,
 
"
r
e
a
s
o
n
"
:
 
"
e
s
w
"
,
 
"
s
i
g
n
i
n
-
c
o
m
m
e
n
t
"
:
 
"
b
l
a
h
b
l
a
h
"
,
 
"
l
a
s
t
-
n
a
m
e
"
:
 
"
m
c
k
e
e
"
,
 
"
s
i
g
n
i
n
-
h
r
m
i
n
"
:
 
"
0
1
:
1
7
:
3
6
"
}
{
"
s
i
g
n
i
n
-
d
a
t
e
"
:
 
"
1
7
-
D
e
c
-
2
0
1
4
"
,
 
"
r
e
a
s
o
n
"
:
 
"
R
e
a
s
o
n
:
 
"
,
 
"
s
i
g
n
i
n
-
c
o
m
m
e
n
t
"
:
 
"
C
o
m
m
e
n
t
s
:
 
"
,
 
"
l
a
s
t
-
n
a
m
e
"
:
 
"
m
c
k
e
e
"
,
 
"
s
i
g
n
i
n
-
h
r
m
i
n
"
:
 
"
2
3
:
0
2
:
4
6
"
}
{
"
s
i
g
n
i
n
-
d
a
t
e
"
:
 
"
1
8
-
D
e
c
-
2
0
1
4
"
,
 
"
r
e
a
s
o
n
"
:
 
"
R
e
a
s
o
n
:
 
"
,
 
"
s
i
g
n
i
n
-
c
o
m
m
e
n
t
"
:
 
"
C
o
m
m
e
n
t
s
:
 
"
,
 
"
l
a
s
t
-
n
a
m
e
"
:
 
"
m
c
k
e
e
"
,
 
"
s
i
g
n
i
n
-
h
r
m
i
n
"
:
 
"
0
4
:
0
5
:
4
2
"
}
In [32]:
rezda
Out[32]:
['{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 ' ',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '1',
 '5',
 ':',
 '0',
 '9',
 ':',
 '5',
 '1',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'E',
 'S',
 'W',
 '"',
 ',',
 ' ',
 '"',
 'f',
 'i',
 'r',
 's',
 't',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'W',
 'i',
 'l',
 'l',
 'i',
 'a',
 'm',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 ' ',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 's',
 'u',
 'n',
 'n',
 'y',
 ' ',
 'a',
 'n',
 'd',
 ' ',
 'm',
 'u',
 'm',
 ' ',
 'i',
 's',
 ' ',
 'b',
 'a',
 'c',
 'k',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'M',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 ' ',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '2',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 '}',
 '{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '9',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'e',
 's',
 'w',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 'c',
 'h',
 'e',
 'c',
 'k',
 'i',
 'n',
 'g',
 ' ',
 't',
 'h',
 'i',
 's',
 ' ',
 'o',
 'n',
 ' ',
 't',
 'h',
 'e',
 ' ',
 'b',
 'u',
 's',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 '-',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'm',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '8',
 ':',
 '3',
 '0',
 ':',
 '5',
 '4',
 '"',
 '}',
 '{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '1',
 '0',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'e',
 'e',
 'f',
 'w',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 'e',
 'r',
 'w',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 '-',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'm',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '1',
 ':',
 '1',
 '6',
 ':',
 '4',
 '4',
 '"',
 '}',
 '{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '1',
 '0',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'e',
 's',
 'w',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 'b',
 'l',
 'a',
 'h',
 'b',
 'l',
 'a',
 'h',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 '-',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'm',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '1',
 ':',
 '1',
 '7',
 ':',
 '3',
 '6',
 '"',
 '}',
 '{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '1',
 '7',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'R',
 'e',
 'a',
 's',
 'o',
 'n',
 ':',
 ' ',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 'C',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 's',
 ':',
 ' ',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 '-',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'm',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '2',
 '3',
 ':',
 '0',
 '2',
 ':',
 '4',
 '6',
 '"',
 '}',
 '{',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'd',
 'a',
 't',
 'e',
 '"',
 ':',
 ' ',
 '"',
 '1',
 '8',
 '-',
 'D',
 'e',
 'c',
 '-',
 '2',
 '0',
 '1',
 '4',
 '"',
 ',',
 ' ',
 '"',
 'r',
 'e',
 'a',
 's',
 'o',
 'n',
 '"',
 ':',
 ' ',
 '"',
 'R',
 'e',
 'a',
 's',
 'o',
 'n',
 ':',
 ' ',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'c',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 '"',
 ':',
 ' ',
 '"',
 'C',
 'o',
 'm',
 'm',
 'e',
 'n',
 't',
 's',
 ':',
 ' ',
 '"',
 ',',
 ' ',
 '"',
 'l',
 'a',
 's',
 't',
 '-',
 'n',
 'a',
 'm',
 'e',
 '"',
 ':',
 ' ',
 '"',
 'm',
 'c',
 'k',
 'e',
 'e',
 '"',
 ',',
 ' ',
 '"',
 's',
 'i',
 'g',
 'n',
 'i',
 'n',
 '-',
 'h',
 'r',
 'm',
 'i',
 'n',
 '"',
 ':',
 ' ',
 '"',
 '0',
 '4',
 ':',
 '0',
 '5',
 ':',
 '4',
 '2',
 '"',
 '}']
In [33]:
len(rezda)
Out[33]:
825
In [33]:
 
In [32]:
 
In [ ]:
 

whaiout

This is the signout script that opens the xl file and fill in signout info.

Opens up list of signin data. This is date of sign in, time, name, and reason. This is .meta. This script appends sign out data. This is signout date, signout time, and comments.

8 urandom 128 keys are generated. Used these in saving the achieve, as .html, and .meta files.

creates date and time mark and asks for comment

This needs rewriten to remove xl stuff opening. keep it to json and dict.

In [1]:
#import xlrd
import os
import time
#from xlutils.copy import copy
#from xlrd import *
import dominate
import json
In [2]:
#wrkbook = xlrd.open_workbook('/home/wcmckee/whai/index.xls')
In [3]:
jsopn = open('/home/wcmckee/visignsys/index.json', 'r')
jsrdv = jsopn.read()
In [4]:
jsrdv
Out[4]:
'{"signin hrmin": "15:09:51", "reason": "ESW", "firstname": "William", "signin comment": "sunny and mum is back", "lastname": "Mckee", "signin date": "02-Dec-2014"}{"signin-date": "09-Dec-2014", "reason": "esw", "signin-comment": "checking this on the bus", "last-name": "mckee", "signin-hrmin": "08:30:54"}'
In [5]:
#print wrkbook.sheet_names()

#worksheet = wrkbook.sheet_by_name('visitor sign database')
#swlis = []
#num_rows = worksheet.nrows - 1
#curr_row = -1
#while curr_row < num_rows:
#    curr_row += 1
#    row = worksheet.row(curr_row)
    #print row
#    swlis.append(row)
In [6]:
#valis = []
In [7]:
#for swl in swlis[1]:
#    print swl.value
#    valis.append(swl.value)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-7-4f71789a25a2> in <module>()
----> 1 for swl in swlis[1]:
      2     print swl.value
      3     valis.append(swl.value)

NameError: name 'swlis' is not defined
In [8]:
tiran = os.urandom(128).encode('hex')
reran = os.urandom(128).encode('hex')
comran = os.urandom(128).encode('hex')
In [7]:
inpcom = raw_input('comment: ')
comment: chexk
In [8]:
endate = time.strftime("%d" + "-" + "%b" + "-" + "%Y")
          
entim = time.strftime("%H:%M")
In [9]:
snoutm = {'out-date': endate}
snoutm.update({'out-time': entim})
snoutm.update({'out-comment': inpcom})
In [10]:
signoutdic = {endate: tiran}
timoutdic = {entim: reran}
In [11]:
signoutdic.update({entim:reran})
In [12]:
signoutdic.update({inpcom: comran})
In [13]:
signkeys = signoutdic.keys()
In [14]:
wha = open('/home/wcmckee/visignsys/index.json', 'a')

#w = copy(open_workbook('/home/wcmckee/whai/index.xls'))
#w.get_sheet(0).write(1,5, time.strftime("%d" + "-" + "%b" + "-" + "%Y"))
#w.get_sheet(0).write(1,6, time.strftime("%H:%M"))
#w.get_sheet(0).write(1,7, tiran)

#w.save('/home/wcmckee/whai/index.xls')
In [15]:
indsav = ('/home/wcmckee/whai/index.html')
In [16]:
opind = open(indsav, 'w')
In [17]:
import dominate
from dominate.tags import *

doc = dominate.document(title=wrkbook.sheet_names())

with doc.head:
    link(rel='stylesheet', href='style.css')
    script(type='text/javascript', src='script.js')

with doc:
    with div(id='header').add(ol()):
        for i in valis:
            li(a(i))

    with div():
        attr(cls='body')
        p('visitor sign database is open source. Visit https://github.com/wcmckee/wcmckee ')

#print doc
In [18]:
opind.write(str(doc))
In [19]:
opind.close()
In [20]:
liop = open('/home/wcmckee/visignsys/index.meta', 'a+')
liop.write(str(signkeys))
liop.close()
In [80]:
 
In [81]:
oplis = open('/home/wcmckee/visignsys/index.meta', 'r')
oplsav = oplis.read()
oplis.close()
In [82]:
trsor = tiran[0:12]
In [83]:
trsor
Out[83]:
'413c61c9654e'
In [84]:
optrd = open('/home/wcmckee/visignsys/posts/' + trsor + '.meta', 'w')
optrd.write(oplsav)
optrd.close()
In [85]:
jsnrd = open('/home/wcmckee/visignsys/posts/' + trsor + '.json', 'w')
jsnrd.write(oplsav)
jsnrd.close()
In [86]:
savpos = open('/home/wcmckee/visignsys/index.json', 'r')
signindi = savpos.read()
---------------------------------------------------------------------------
IOError                                   Traceback (most recent call last)
<ipython-input-86-011d26fbe418> in <module>()
----> 1 savpos = open('/home/wcmckee/visignsys/index.json', 'r')
      2 signindi = savpos.read()

IOError: [Errno 2] No such file or directory: '/home/wcmckee/visignsys/index.json'
In [87]:
jsnaccept = signindi.replace("'", "\"")
d = json.loads(jsnaccept)
In [88]:
snct = dict(d.items() + snoutm.items())
In [89]:
savpos.close()
In [90]:
os.chdir('/home/wcmckee/visignsys/posts')
In [91]:
lismet = os.listdir('/home/wcmckee/visignsys/posts')
In [92]:
lismet
Out[92]:
['df5aed94d944.meta',
 '5e98dfab4326.json',
 '8bf3b3c045b2.json',
 '5e98dfab4326.html',
 '2ce7b7e78b2d.meta',
 'a5fd59588711.html',
 '8bf3b3c045b2.meta',
 '4a1d7fbd4af7.html',
 '413c61c9654e.meta',
 '81ae6564478e.meta',
 '1ca6f0c7d074.html',
 'be7f3bba40a9.json',
 '1bca31594654.html',
 'be7f3bba40a9.html',
 '81ae6564478e.json',
 'be7f3bba40a9.meta',
 'd948bc2cb2d5.meta',
 '469b6be62e65.json',
 'f63a51c5660b.json',
 'f63a51c5660b.html',
 '4a1d7fbd4af7.json',
 'd7bfad4d84af.meta',
 '91ece852eb61.meta',
 '086ef98a8bea.meta',
 '231f0cbc6422.html',
 '413c61c9654e.json',
 '1ca6f0c7d074.meta',
 '5e98dfab4326.meta',
 '9479b52fcb96.meta',
 'a5fd59588711.meta',
 '6e8165886873.json',
 '3342b7e37622.json',
 '231f0cbc6422.meta',
 '1ca6f0c7d074.json',
 'd948bc2cb2d5.html',
 '4a1d7fbd4af7.meta',
 '6e8165886873.meta',
 'a5fd59588711.json',
 '6dbfbbb9c12e.html',
 '81ae6564478e.html',
 'd7bfad4d84af.json',
 '231f0cbc6422.json',
 'df5aed94d944.html',
 '469b6be62e65.meta',
 '9479b52fcb96.json',
 'f63a51c5660b.meta',
 '2ce7b7e78b2d.json',
 'df5aed94d944.json',
 '3342b7e37622.meta',
 '1bca31594654.meta',
 '1bca31594654.json',
 '6dbfbbb9c12e.meta',
 '086ef98a8bea.json',
 'd948bc2cb2d5.json']
In [99]:
opjsnz = []
In [100]:
for beca in lismet:
    if '.json' in beca:
        print beca
        opjsnz.append(beca)
5e98dfab4326.json
8bf3b3c045b2.json
be7f3bba40a9.json
81ae6564478e.json
469b6be62e65.json
f63a51c5660b.json
4a1d7fbd4af7.json
413c61c9654e.json
6e8165886873.json
3342b7e37622.json
1ca6f0c7d074.json
a5fd59588711.json
d7bfad4d84af.json
231f0cbc6422.json
9479b52fcb96.json
2ce7b7e78b2d.json
df5aed94d944.json
1bca31594654.json
086ef98a8bea.json
d948bc2cb2d5.json
In [121]:
optjz = []
optjsz = []
In [122]:
apgpls = []
In [124]:
#for opjw in opjsnz:
#    print opjw
#    optjsz.append(objw)
    
In [125]:
#for filop in opjsnz:
    #print filop
#    opt = open(opj, 'r')
#    thedict = str(opt.read())
#    thedict
#    opt.close()
In [126]:
#opt = open(opj, 'r')
In [127]:
#thedict = opt.read()
In [128]:
#thedict
Out[128]:
'{"signin hrmin": "19:11", "reason": "ESW", "name": "William Mckee", "signin date": "21-Oct-2014", "signin comment": "interested in watching swimmers on way back from walk"}'
In [129]:
#convgpj = json.loads(thedict)
In [130]:
#convgpj.values()
Out[130]:
[u'19:11',
 u'ESW',
 u'interested in watching swimmers on way back from walk',
 u'William Mckee',
 u'21-Oct-2014']
In [130]:
 
In [ ]: