brumbrum

Challenge 2

Speed Runs: optimising truck payload and cycle speeds

Description

Challenge Statement

At the Cowal Gold Mine, trucks transport ore over many km’s from the open pit. The more ore they take, the slower they move. Can you optimise for speed and tonnage?

Background

The Cowal operation is an open pit mining operation with production from a number of different areas within a large, single pit. As a result of the general site configuration and layout, haul distances, in particular for waste disposal, is the longest of all open pit operations across the company. Material is typically hauled for a 4km distance at a 1:10 incline and thereafter another 1-3km on a reasonably flat surface to the crusher, stockpiles or waste rock dumps depending on material type. A fleet monitoring system (MineStar) is installed and fully operational with a significant amount of individual truck cycle data available. Mining is carried out with a company-owned fleet of mining equipment that includes 2 x Liebherr 994 backhoe excavators, 1 x Hitachi 3600 Excavator and 16 x Cat 789 dump trucks. The Challenge

The key question remains: What is the optimum payload vs haul speed trade-off and how does that change with increases to the depth of the pit and haulage distances to future waste dump locations? The Opportunity

At present the operating philosophy is to “light load” trucks to ensure they reach at least second gear on the way out on the ramp. Payloads are controlled, however given the haul distance required, every tonne counts. If the optimum cycle speeds and payload can be determined, there is potential to improve productivity and reduce costs across the haulage fleet. Considerations

Critical areas to consider include:

For any given haulage cycle, can payload be “calibrated” to a specific location using historical VIMS data?
In excess of two years VIMS data is available for analysis that could be “mined” to establish the optimum trade-off!
Can payload optimisation be done in real time and displayed to the operator?

Data Summary

TBC Mentors

Ryan Kare - Lead Mentor

Elam Athimoolam

In [153]:
import pandas
import json
In [ ]:
 
In [ ]:
 
In [154]:
pitconf = pandas.read_excel('/home/wcmckee/data/Truck_Spec.xlsx', sheetname='SiteInfo', index_col='Parameter')
In [155]:
rimpulldata = pandas.read_excel('/home/wcmckee/data/Truck_Spec.xlsx', sheetname='Rimpull')
In [156]:
print(rimpulldata)
    Speed (km)   Force (kgf)
0     0.000000  85124.846100
1     1.269841  84181.974090
2     2.116402  81415.558050
3     3.386243  75308.980180
4     4.444444  69660.426470
5     5.291005  63721.832620
6     6.137566  58289.507510
7     6.984127  52729.697590
8     7.407407  50432.017810
9     8.465608  49811.731730
10    8.982951  48895.577950
11    9.805996  46534.040820
12   10.370370  43904.562710
13   10.793651  41372.434300
14   11.005291  39471.824240
15   11.216931  36920.178720
16   11.851852  36511.238150
17   12.275132  36106.827140
18   13.004115  35050.162540
19   13.474427  34278.007420
20   13.874192  33398.632820
21   14.203410  32461.371820
22   15.026455  29657.294330
23   15.238095  28718.205310
24   15.449735  27331.186050
25   16.084656  27028.456450
26   16.507937  26729.079990
27   17.330982  26043.367020
28   19.776602  23413.949310
29   20.129336  22756.887040
..         ...           ...
32   21.375661  20232.668330
33   22.433862  19786.942620
34   24.126984  18924.732900
35   25.396825  18100.093690
36   27.231041  16701.106460
37   27.654321  16292.804100
38   27.889477  15796.434640
39   28.148148  14977.793750
40   29.417989  14811.894570
41   30.264550  14647.832940
42   32.169312  14166.470980
43   34.074074  13549.171520
44   35.978836  12815.234900
45   37.037037  12394.096380
46   37.671958  11986.797470
47   37.883598  11722.728300
48   38.095238  11211.914150
49   39.788360  11087.727140
50   41.269841  10964.915670
51   42.962963  10723.358560
52   44.232804  10487.122960
53   48.583186   9522.120346
54   50.229277   9073.447255
55   50.723104   8895.549731
56   52.204586   8007.331433
57   53.380364   6893.723843
58   54.673721   5725.762322
59   54.861846   5186.037841
60   55.097002   4249.158596
61   55.261611   3848.622425

[62 rows x 2 columns]
In [114]:
rimpulldata
Out[114]:
Speed (km) Force (kgf)
0 0.000000 85124.846100
1 1.269841 84181.974090
2 2.116402 81415.558050
3 3.386243 75308.980180
4 4.444444 69660.426470
5 5.291005 63721.832620
6 6.137566 58289.507510
7 6.984127 52729.697590
8 7.407407 50432.017810
9 8.465608 49811.731730
10 8.982951 48895.577950
11 9.805996 46534.040820
12 10.370370 43904.562710
13 10.793651 41372.434300
14 11.005291 39471.824240
15 11.216931 36920.178720
16 11.851852 36511.238150
17 12.275132 36106.827140
18 13.004115 35050.162540
19 13.474427 34278.007420
20 13.874192 33398.632820
21 14.203410 32461.371820
22 15.026455 29657.294330
23 15.238095 28718.205310
24 15.449735 27331.186050
25 16.084656 27028.456450
26 16.507937 26729.079990
27 17.330982 26043.367020
28 19.776602 23413.949310
29 20.129336 22756.887040
... ... ...
32 21.375661 20232.668330
33 22.433862 19786.942620
34 24.126984 18924.732900
35 25.396825 18100.093690
36 27.231041 16701.106460
37 27.654321 16292.804100
38 27.889477 15796.434640
39 28.148148 14977.793750
40 29.417989 14811.894570
41 30.264550 14647.832940
42 32.169312 14166.470980
43 34.074074 13549.171520
44 35.978836 12815.234900
45 37.037037 12394.096380
46 37.671958 11986.797470
47 37.883598 11722.728300
48 38.095238 11211.914150
49 39.788360 11087.727140
50 41.269841 10964.915670
51 42.962963 10723.358560
52 44.232804 10487.122960
53 48.583186 9522.120346
54 50.229277 9073.447255
55 50.723104 8895.549731
56 52.204586 8007.331433
57 53.380364 6893.723843
58 54.673721 5725.762322
59 54.861846 5186.037841
60 55.097002 4249.158596
61 55.261611 3848.622425

62 rows × 2 columns

In [151]:
def createtruckspec(truckid, trucktype, emptyweight, maxoperatingweight):
    return({'truckid' : truckid, 'TruckType' : trucktype, 'EmptyWeight' : emptyweight, 'maxoperatingweight' : maxoperatingweight})
In [152]:
createtruckspec(450, "789C", 114114, 317514.85)
Out[152]:
{'EmptyWeight': 114114,
 'TruckType': '789C',
 'maxoperatingweight': 317514.85,
 'truckid': 450}
In [ ]:
 
In [ ]:
@app.route('/getruckspec')
def getruckspec():
    # give it truck id and returns back stats about that truck.
    # todo: return back truck segment details also. 
    #localhost:5555/getruckspec?truckid=410
    #returns back:
    #{"EmptyWeight": 124114, "MaxOperatingWeight": 317514.85, "TruckType": "789C"}
    truckid = request.args.get('truckid')
    #truckid = request.args.get("truckid")
    
    return(jsonify({'truckid' : truckid, 'EmptyWeight' : truckdicspec['EmptyWeight'][truckid],
        'MaxOperatingWeight' : truckdicspec['MaxOperatingWeight'][truckid], 'TruckType' : truckdicspec['TruckType'][truckid]}))
In [104]:
with open('/home/wcmckee/rimpulldata.json', 'w') as rimwr:
    rimwr.write(rimpulldata.to_json())
In [123]:
with open('/home/wcmckee/rimpulldata.json', 'r') as rimrd:
    rimrda = (rimrd.read())
In [133]:
forjs = json.loads(rimrda)
In [135]:
forjs['Speed (km)']['12']
Out[135]:
10.37037037
In [139]:
forjs['Force (kgf)']['12']
Out[139]:
43904.56271
In [ ]:
 
In [ ]:
 
In [145]:
def lookrimid(rimid):
    
    return({ 'speed' : forjs['Speed (km)'][str(rimid)], 'force' : forjs['Force (kgf)'][str(rimid)], 'id' : rimid})
In [146]:
lookrimid(12)
Out[146]:
{'force': 43904.56271, 'id': 12, 'speed': 10.37037037}
In [148]:
with open('/home/wcmckee/rimpulldata.json', 'r') as rimrd:
    rimjsconv = (rimrd.read())
          
forjs = json.loads(rimjsconv)
In [150]:
forjs['Force (kgf)']
Out[150]:
{'0': 85124.8461,
 '1': 84181.97409,
 '10': 48895.57795,
 '11': 46534.04082,
 '12': 43904.56271,
 '13': 41372.4343,
 '14': 39471.82424,
 '15': 36920.17872,
 '16': 36511.23815,
 '17': 36106.82714,
 '18': 35050.16254,
 '19': 34278.00742,
 '2': 81415.55805,
 '20': 33398.63282,
 '21': 32461.37182,
 '22': 29657.2943299999,
 '23': 28718.2053099999,
 '24': 27331.18605,
 '25': 27028.4564499999,
 '26': 26729.07999,
 '27': 26043.36702,
 '28': 23413.94931,
 '29': 22756.88704,
 '3': 75308.98018,
 '30': 22063.58565,
 '31': 20459.28236,
 '32': 20232.66833,
 '33': 19786.94262,
 '34': 18924.7329,
 '35': 18100.09369,
 '36': 16701.10646,
 '37': 16292.8041,
 '38': 15796.43464,
 '39': 14977.79375,
 '4': 69660.42647,
 '40': 14811.89457,
 '41': 14647.83294,
 '42': 14166.4709799999,
 '43': 13549.17152,
 '44': 12815.2349,
 '45': 12394.09638,
 '46': 11986.79747,
 '47': 11722.7283,
 '48': 11211.91415,
 '49': 11087.72714,
 '5': 63721.83262,
 '50': 10964.91567,
 '51': 10723.35856,
 '52': 10487.12296,
 '53': 9522.120346,
 '54': 9073.447255,
 '55': 8895.549731,
 '56': 8007.331433,
 '57': 6893.723843,
 '58': 5725.762322,
 '59': 5186.037841,
 '6': 58289.50751,
 '60': 4249.158596,
 '61': 3848.622425,
 '7': 52729.69759,
 '8': 50432.01781,
 '9': 49811.73173}
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [137]:
lookrimid(12)
Out[137]:
10.37037037
In [ ]:
 
In [ ]:
 
In [129]:
print(forjs['Speed (km)'])
{'23': 15.23809524, '30': 20.36449148, '32': 21.37566138, '14': 11.00529101, '4': 4.444444444, '15': 11.21693122, '5': 5.291005291, '12': 10.37037037, '47': 37.88359788, '45': 37.03703704, '51': 42.96296296, '6': 6.137566138, '3': 3.386243386, '61': 55.26161082, '59': 54.86184597, '38': 27.88947678, '25': 16.08465608, '1': 1.26984127, '39': 28.14814815, '0': 0.0, '10': 8.982951205, '34': 24.12698413, '50': 41.26984127, '19': 13.47442681, '49': 39.78835979, '21': 14.20340976, '41': 30.26455026, '24': 15.44973545, '31': 20.74074074, '7': 6.984126984, '58': 54.67372134, '44': 35.97883598, '43': 34.07407407, '37': 27.65432099, '48': 38.0952381, '28': 19.776602, '57': 53.38036449, '53': 48.58318636, '55': 50.72310406, '11': 9.805996473, '20': 13.87419165, '2': 2.116402116, '46': 37.67195767, '35': 25.3968254, '27': 17.33098178, '60': 55.09700176, '26': 16.50793651, '8': 7.407407407, '36': 27.23104056, '54': 50.2292769, '16': 11.85185185, '56': 52.20458554, '18': 13.00411523, '29': 20.12933568, '9': 8.465608466, '33': 22.43386243, '22': 15.02645503, '13': 10.79365079, '52': 44.23280423, '40': 29.41798942, '42': 32.16931217, '17': 12.27513228}
In [ ]:
 
In [ ]:
 
In [125]:
rimrda
Out[125]:
'{"Speed (km)":{"0":0.0,"1":1.26984127,"2":2.116402116,"3":3.386243386,"4":4.444444444,"5":5.291005291,"6":6.137566138,"7":6.984126984,"8":7.407407407,"9":8.465608466,"10":8.982951205,"11":9.805996473,"12":10.37037037,"13":10.79365079,"14":11.00529101,"15":11.21693122,"16":11.85185185,"17":12.27513228,"18":13.00411523,"19":13.47442681,"20":13.87419165,"21":14.20340976,"22":15.02645503,"23":15.23809524,"24":15.44973545,"25":16.08465608,"26":16.50793651,"27":17.33098178,"28":19.776602,"29":20.12933568,"30":20.36449148,"31":20.74074074,"32":21.37566138,"33":22.43386243,"34":24.12698413,"35":25.3968254,"36":27.23104056,"37":27.65432099,"38":27.88947678,"39":28.14814815,"40":29.41798942,"41":30.26455026,"42":32.16931217,"43":34.07407407,"44":35.97883598,"45":37.03703704,"46":37.67195767,"47":37.88359788,"48":38.0952381,"49":39.78835979,"50":41.26984127,"51":42.96296296,"52":44.23280423,"53":48.58318636,"54":50.2292769,"55":50.72310406,"56":52.20458554,"57":53.38036449,"58":54.67372134,"59":54.86184597,"60":55.09700176,"61":55.26161082},"Force (kgf)":{"0":85124.8461,"1":84181.97409,"2":81415.55805,"3":75308.98018,"4":69660.42647,"5":63721.83262,"6":58289.50751,"7":52729.69759,"8":50432.01781,"9":49811.73173,"10":48895.57795,"11":46534.04082,"12":43904.56271,"13":41372.4343,"14":39471.82424,"15":36920.17872,"16":36511.23815,"17":36106.82714,"18":35050.16254,"19":34278.00742,"20":33398.63282,"21":32461.37182,"22":29657.2943299999,"23":28718.2053099999,"24":27331.18605,"25":27028.4564499999,"26":26729.07999,"27":26043.36702,"28":23413.94931,"29":22756.88704,"30":22063.58565,"31":20459.28236,"32":20232.66833,"33":19786.94262,"34":18924.7329,"35":18100.09369,"36":16701.10646,"37":16292.8041,"38":15796.43464,"39":14977.79375,"40":14811.89457,"41":14647.83294,"42":14166.4709799999,"43":13549.17152,"44":12815.2349,"45":12394.09638,"46":11986.79747,"47":11722.7283,"48":11211.91415,"49":11087.72714,"50":10964.91567,"51":10723.35856,"52":10487.12296,"53":9522.120346,"54":9073.447255,"55":8895.549731,"56":8007.331433,"57":6893.723843,"58":5725.762322,"59":5186.037841,"60":4249.158596,"61":3848.622425}}'
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [121]:
for putf in pitconf.values:
    print(putf)
['60 km/h' nan]
['15.7 km/h' 'Locked in 2nd gear']
['28.7 km/h ' 'Locked in 4th gear']
['35 tonnes' nan]
['33.8 sec' 'Time taken to load a bucket']
['30 sec' 'Time to load first bucket']
['30 sec' 'Time taken to reverse under a loader']
['45 sec' nan]
In [ ]:
 
In [118]:
print(pitconf)
                                      Value  \
Parameter                                     
Max Speed Limit                     60 km/h   
Downhill Speed Limit - Loaded     15.7 km/h   
Downhill Speed Limit - Unloaded  28.7 km/h    
Loader bucket capacity            35 tonnes   
Loader cycle time                  33.8 sec   
First loader pass                    30 sec   
Truck spot time at loader            30 sec   
Dump time                            45 sec   

                                                             Comments  
Parameter                                                              
Max Speed Limit                                                   NaN  
Downhill Speed Limit - Loaded                      Locked in 2nd gear  
Downhill Speed Limit - Unloaded                    Locked in 4th gear  
Loader bucket capacity                                            NaN  
Loader cycle time                         Time taken to load a bucket  
First loader pass                           Time to load first bucket  
Truck spot time at loader        Time taken to reverse under a loader  
Dump time                                                         NaN  
In [ ]:
 
In [95]:
import json
In [ ]:
 
In [96]:
with open('/home/wcmckee/pitconfig.json', 'w') as pitw:
    pitw.write(pitconf.to_json())
In [97]:
with open('/home/wcmckee/pitconfig.json', 'r') as pitr:
    pitjs = pitr.read()
    pitlo = json.loads(pitjs)
    
In [98]:
print(pitlo)
{'Comments': {'Downhill Speed Limit - Loaded': 'Locked in 2nd gear', 'Downhill Speed Limit - Unloaded': 'Locked in 4th gear', 'Dump time': None, 'First loader pass': 'Time to load first bucket', 'Truck spot time at loader': 'Time taken to reverse under a loader', 'Max Speed Limit': None, 'Loader bucket capacity': None, 'Loader cycle time': 'Time taken to load a bucket'}, 'Value': {'Downhill Speed Limit - Loaded': '15.7 km/h', 'Downhill Speed Limit - Unloaded': '28.7 km/h ', 'Dump time': '45 sec', 'First loader pass': '30 sec', 'Truck spot time at loader': '30 sec', 'Max Speed Limit': '60 km/h', 'Loader bucket capacity': '35 tonnes', 'Loader cycle time': '33.8 sec'}}
In [88]:
pitlo.keys()
Out[88]:
dict_keys(['Comments', 'Value', 'Parameter'])
In [92]:
for pitl in pitlo.values():
    print(pitl['1'])
Locked in 2nd gear
15.7 km/h
Downhill Speed Limit - Loaded
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [ ]:
truk= pandas.read_excel('/home/wcmckee/data/Truck_Spec.xlsx', index_col='Truckid')
In [ ]:
 
In [ ]:
print(truk)
In [ ]:
with open('/home/wcmckee/truckspec.json', 'w') as truc:
    truc.write(truk.to_json())
In [ ]:
cat /home/wcmckee/truckspec.json
In [ ]:
with open('/home/wcmckee/truckspec.json', 'r') as truatm:
    tread = truatm.read()
    #print(truatm.read())
In [ ]:
truckdicspec = json.loads(tread)
In [ ]:
 
In [ ]:
truckdicspec
In [ ]:
truckdicspec['TruckType']['418']
In [ ]:
def gettruck(truckid):
           with open('/home/wcmckee/truckspec.json', 'r') as truatm:
               tread = truatm.read()
           truckdicspec = json.loads(tread)
           return({'EmptyWeight' : truckdicspec['EmptyWeight'][truckid],
        'MaxOperatingWeight' : truckdicspec['MaxOperatingWeight'][truckid], 'TruckType' : truckdicspec['TruckType'][truckid]})
In [ ]:
gettruck('410')
In [ ]:
import json
In [ ]:
tred = json.loads(trurd.read())
In [ ]:
#create database of trucks from existing file. 
In [ ]:
somedict = dict()
In [ ]:
somedict.update({truk['Truck ID'] : dict({'test' : 'this is test'})})
In [ ]:
 
In [ ]:
for truk['Truck ID'] in truk:
    print(truk['Truck ID'])
In [ ]:
 
In [ ]:
 
In [ ]:
truk.values
In [ ]:
truk.keys()
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [ ]:
with open('/home/wcmckee/truckspec.json', 'r') as trurd:
    #print(trurd.read())
    tred = json.loads(trurd.read())
    print(tred)
    
In [ ]:
import sqlite3
In [ ]:
connid = sqlite3.connect('identity.db')
In [ ]:
connid
In [ ]:
c = connid.cursor()
In [ ]:
def createdb(nameofdb):
    connid = sqlite3.connect('{}.db'.format(nameofdb))
    c.execute('''CREATE TABLE truckspec
             (truckid, trucktype, emptyweight, maxgrossoperatingweight)''')

    c = connid.cursor()
    
    c.close()

    return(nameofdb)
In [ ]:
createdb('testing')
In [ ]:
 
In [ ]:
createdb('truckspec')
In [ ]:
 
In [ ]:
c.execute('''CREATE TABLE truckspec
             (truckid, trucktype, emptyweight, maxgrossoperatingweight)''')
In [ ]:
def createtruk(truckid, trucktype, emptyweight, maxgrossoperatingweight):
    connid = sqlite3.connect('truckspec.db')
    c = connid.cursor()
    c.execute("INSERT INTO truckspec VALUES ('{}','{}','{}', '{}')".format(truckid, trucktype, emptyweight, maxgrossoperatingweight))
    connid.commit()
    connid.close()

    return({truckid : dict({'trucktype' : trucktype, 'emptyweight' : emptyweight,
                           'maxgrossoperatingweight' : maxgrossoperatingweight})})
In [ ]:
def select_all_tasks():
    """
    Query all rows in the tasks table
    :param conn: the Connection object
    :return:
    """
    connid = sqlite3.connect('truckspec.db')

    cur = connid.cursor()
    cur.execute("SELECT * truckspec;")
 
    rows = cur.fetchall()
 
    for row in rows:
        print(row)
In [ ]:
select_all_tasks()
In [ ]:
 
In [ ]:
 
In [ ]:
createtruk(420, '789C', 124114, 317514.85)
In [ ]:
createtruk(421, '789C', 114114, 317514.85)
In [ ]:
cat /home/wcmckee/truckspec.json
In [ ]:
truk.to_json()
In [ ]:
truk.sort_values(by='Truck ID')

an api on truck spec

{401 : dict({'type' : '789c', 'emptyweight' : 124114, 'maxweight' : 317514.85})}

In [ ]:
forohon = ({401 : dict({'type' : '789c', 'emptyweight' : 124114, 'maxweight' : 317514.85})})
In [ ]:
forohon
In [ ]:
tru.loc[tru['Truck'] == 'TRH404']
In [ ]:
tulo = tru.loc('TRH404')
In [ ]:
tulo.name
In [ ]:
print(tulo())
In [ ]:
roadsegdata = {'Road-segment-data' : dict({'Truck' : 'ID of the 16 CAT trucks', 
                             'Description - CR' : 'Grade the truck needs to climb',
                            'Payload' : 'Weight in kg excluding truck weight.',
                            'Start velocity' : 'Force the truck starts the grade in km/hr.',
                            'End velocity' : 'Force the truck exits the grade in km/hr.',
                            'EfhLength' : 'Not valid as it is an old measure of distance.', 
                            'Target duration' : 'Optimal time for the truck to cover the grade in seconds.', 
                            'Duration calculated' : 'Actual time for the truck to cover the grade in seconds.',
                            'Slope length' : 'Slope length(curvature distance) in meters.',
                            'Rise height' : 'Measure of the slope in meters.',
                            'StartWayPoint X,Y,Z' : 'Starting location of the grade',
                            'EndWayPoint X,Y,Z' : 'Ending location of the grade.'})}
In [ ]:
print(roadsegdata)
In [ ]:
truckcycledict = {'Truck cycle' : dict({'Date' : 'Date the data was recorded.', 
                       'Source stage' : 'Stage the truck starts from.',
                      'Source bench' : 'Bench the truck starts from.',
                      'Destination name' : 'Unloading dump.', 
                      'Truck' : 'ID of the 16 CAT trucks',
                      'Travelling empty duration' : 'Travelling time in seconds when the truck is empty. (return time duration',
                      'Travelling full duration' : 'Travelling time in seconds when the truck is loaded.',
                      'Payload' : 'Weight in kg excluding truck weight.',
                      'Full slope length' : 'Distance in meters from bench to dump location.',
                      'Empty slope length' : 'Return distance in meters.',
                      'Inpit ramp length' : 'Distance from the bench to the exit of the pit in meters.', 
                     'Inpit ramp grade' : 'Inclination angle of the in-pit grade in %',
                      'Dump ramp length' : 'Distance from the exit of pit to the dump location.', 
                      'Dump ramp grade' : 'Inclination angle after the exit of pit to the dump location in %.'})}
In [ ]:
trucyc = truckcycledict['Truck cycle']
In [ ]:
print(trucyc)
In [ ]:
for truc in trucyc:
    print(truc)
In [ ]:
truckcycledict.update(roadsegdata)
In [ ]:
truckcy = json.dumps(truckcycledict)
In [ ]:
with open('/home/wcmckee/truckscycroad.json', 'w') as t:
    t.write(truckcy)
In [ ]:
 
In [159]:
with open('/home/wcmckee/truckscycroad.json', 'r') as r:
    #print(r.read())
    decjs = json.loads(r.read())
In [ ]:
print({410 : dict({'Truck' : 410, 'Payload' : 100, })})
In [160]:
print(decjs['Road-segment-data'])
{'Target duration': 'Optimal time for the truck to cover the grade in seconds.', 'Truck': 'ID of the 16 CAT trucks', 'End velocity': 'Force the truck exits the grade in km/hr.', 'Duration calculated': 'Actual time for the truck to cover the grade in seconds.', 'Payload': 'Weight in kg excluding truck weight.', 'StartWayPoint X,Y,Z': 'Starting location of the grade', 'Description - CR': 'Grade the truck needs to climb', 'Rise height': 'Measure of the slope in meters.', 'EndWayPoint X,Y,Z': 'Ending location of the grade.', 'Start velocity': 'Force the truck starts the grade in km/hr.', 'EfhLength': 'Not valid as it is an old measure of distance.', 'Slope length': 'Slope length(curvature distance) in meters.'}
In [161]:
print(decjs['Truck cycle'])
{'Travelling full duration': 'Travelling time in seconds when the truck is loaded.', 'Inpit ramp length': 'Distance from the bench to the exit of the pit in meters.', 'Truck': 'ID of the 16 CAT trucks', 'Destination name': 'Unloading dump.', 'Empty slope length': 'Return distance in meters.', 'Date': 'Date the data was recorded.', 'Dump ramp length': 'Distance from the exit of pit to the dump location.', 'Payload': 'Weight in kg excluding truck weight.', 'Full slope length': 'Distance in meters from bench to dump location.', 'Dump ramp grade': 'Inclination angle after the exit of pit to the dump location in %.', 'Inpit ramp grade': 'Inclination angle of the in-pit grade in %', 'Source bench': 'Bench the truck starts from.', 'Travelling empty duration': 'Travelling time in seconds when the truck is empty. (return time duration', 'Source stage': 'Stage the truck starts from.'}
In [ ]:
cat /home/wcmckee/truckscycroad.json
In [ ]:
for trucy in truckcycledict['Road-segment-data']:
    print({trucy : dict({'example' : 'l33t', 'Description' : 'this is an example'})})
    
    
In [ ]:
def createlocation(truck, longitude, latitude):
    return({'truck' : truck, 'longitude' : longitude, 'latitude' : latitude})
In [ ]:
import pandas
In [ ]:
descri = pandas.read_excel('/home/wcmckee/data/Road Segment Data.xlsx')
In [ ]:
dropcol = descri.drop('Unnamed: 0', axis=1)
In [ ]:
with open('/home/wcmckee/roadsegdatatest.json', 'w') as roaddrop:
    roaddrop.write(dropcol.to_json())
In [ ]:
with open('/home/wcmckee/roadsegdatatest.json', 'r') as roard:
    roadjsconv = roard.read()
In [ ]:
import json
In [ ]:
roadjslod = json.loads(roadjsconv)
In [ ]:
import getpass
In [ ]:
getpass.getuser()
In [ ]:
 
In [ ]:
for paka in roadjslod.keys():
    print(roadjslod[paka]['127'])
In [ ]:
startdict ={'StartWaypoint' : dict({'x' : int(samptru['StartWaypointCoordsX']), 
                         'y' : int(samptru['StartWaypointCoordsY']), 
                         'z' : int(samptru['StartWaypointCoordsZ']),
                        'velocity' : int(samptru['StartVelocity'])})}
In [ ]:
startdict
In [ ]:
def xyz(startx, starty, startz, startvelo, endx, endy, endz, endvelo):
           return({'startwaypoint' : dict({'x' : startx, 
                         'y' : starty, 
                         'z' : startz,
                        'velocity' : startvelo}),
                  'endwaypoint' : dict({'x' : endx, 'y' : endy, 'z' : endz, 
                                        'velocity' : endvelo})})
In [ ]:
xyz(86545, 36238, 1097, 13, 86578, 36354, 1097, 13)
In [ ]:
endist = {'EndWaypoint' : dict({'x' : int(samptru['EndWaypointCoordsX']), 
                       'y' : int(samptru['EndWaypointCoordsY']),
                      'z' : int(samptru['StartWaypointCoordsZ']),
                      'velocity' : int(samptru['EndVelocity'])})}
In [ ]:
startdict.update(endist)
In [ ]:
{'stats' : dict({'Description-CR' : samptru['Description-CR']})}
In [ ]:
samdes = samptru['Description-CR']
In [ ]:
import requests
In [ ]:
reqge = requests.get('https://geocode.xyz/51.50354,-0.12768')
In [ ]:
rejs = (reqge.json)
In [ ]:
print(str(rejs))
In [ ]:
print(dict({'truck' : truck, dict({'startwaypoint' : {'longitude' : int(samptru['StartWaypointCoordsX'], 'latitude' : int(samptru['StartWaypointCoordsY']))}}
In [ ]:
print ({'truck' : 402, dict({'startwaypoint' : dict({'longitude' : 1, 'latitude' : 3})})
In [ ]:
{'startwaypoint' : dict({'longitude' : 1, 'latitude' : 3})}
In [157]:
trukxlc = pandas.read_excel('/home/wcmckee/data/Scenario1.xlsx')
In [ ]:
 
In [158]:
print(trukxlc['Destination 1'])
0      Material Carried: 50%
1                          Y
2                    36277.7
3                      36284
4                    36262.6
5                    36256.8
6                    36239.5
7                    36228.9
8                    36214.3
9                    36206.9
10                   36203.7
11                   36202.4
12                   36203.7
13                   36216.5
14                   36230.2
15                   36255.4
16                   36271.1
17                   36287.6
18                   36369.5
19                   36399.5
20                   36418.2
21                   36445.7
22                   36441.9
23                   36431.9
24                   36375.1
25                   36331.4
26                   36291.2
27                     36224
28                   36195.9
29                   36176.6
               ...          
82                   36715.2
83                   36653.3
84                   36620.6
85                   36601.8
86                   36540.6
87                   36530.5
88                   36515.1
89                   36458.7
90                   36383.8
91                   36316.8
92                   36287.3
93                   36270.7
94                   36222.7
95                   36156.4
96                   36094.5
97                   36042.8
98                   35983.4
99                   35970.8
100                  35962.4
101                      NaN
102                      NaN
103                      NaN
104                      NaN
105                      NaN
106                      NaN
107                      NaN
108                      NaN
109                      NaN
110                      NaN
111                      NaN
Name: Destination 1, dtype: object
In [ ]:
trukcyc = pandas.read_csv('/home/wcmckee/data/Truck Cycle.csv', index_col='Truck')
In [ ]:
for trkey in trukcyc.keys():
    trk = (trkey.replace(' ', '-'))
    print(trk.lower())
In [ ]:
import json
In [ ]:
import random
In [ ]:
print(truckdicspec)
In [ ]:
with open('/home/wcmckee/truckspec.json', 'r') as truatm:
        tread = truatm.read()
        truckdicspec = json.loads(tread)
        truckid = random.choice(truckdicspec)
In [ ]:
truits = trukcyc.iteritems()
In [ ]:
somedict = dict()
In [ ]:
somedict
In [ ]:
somelis = list()
In [ ]:
 
In [ ]:
for tru in truits:
    #print(tru)
    somelis.append(tru)
    #somedict.update({tru : tru})
In [ ]:
for somel in somelis:
    print(somel)
In [ ]:
for somel in somelis:
    print(somel)
In [ ]:
import random
In [ ]:
random.choice()
In [ ]:
def createtruckcyc(date, sourcestage, sourcebench, destinationname,
       travellingemptyduration, travellingfullduration, payload,
       fullslopelength, emptyslopelength, inpitramplength,
       inpitrampgrade, dumpramplength, dumprampgrade):
        return (dict({'date' : date, 'sourcestage' : sourcebench, 'sourcebench' : sourcebench, 
                      'destinationname' : destinationname, 
                      'travellingemptyduration' : travellingemptyduration, 
                      'travellingfullduration' : travellingfullduration, 
                      'payload' : payload, 'fullslopelength' : fullslopelength, 
                      'emptyslopelength' : emptyslopelength, 
                      'inpitramplength' : inpitramplength, 
                      'inpitrampgrade' : inpitrampgrade, 'dumpramplength' : dumpramplength, 
                      'dumprampgrade' : dumprampgrade}))
In [ ]:
import arrow
In [ ]:
timnow = arrow.now()
In [ ]:
print(timnow)
In [ ]:
str(timnow.date())
In [ ]:
'9.34%', 336.00, '8.33%'
In [ ]:
createtruckcyc('01/01/2017', 'Stage G', '920', 'MW_PRIME', 0.0, 1009, 179.7, 3888.21, 3888.21,
              42.94, 3108.02, '9.34%', 336.00, '8.33%')
In [ ]:
trjs = trukcyc.to_json
In [ ]:
 
In [ ]:
 
In [ ]:
with open('/home/wcmckee/truckdetail.json', 'w') as truckdet:
    truckdet.write(trjs)
In [ ]:
trukd = pandas.read_json('/home/wcmckee/truckdetail.json')
In [ ]:
trukcyc.to_json 
In [ ]:
 
In [ ]:
trujsn = (trukcyc.to_json)
In [ ]:
trujsn
In [ ]:
trujsn
In [ ]:
with open('/home/wcmckee/truckinfo.json', 'w') as truwri:
    truwri.write(str(trujsn))
In [ ]:
 
In [ ]:
cat /home/wcmckee/truckinfo.json
In [ ]:
import arrow
In [ ]:
 
In [ ]:
def mkblogpost(blogpath, postname, tagblog):
    raw = arrow.now()
    fultim = raw.datetime
    
    if postname + '.md' not in os.listdir(blogpath + '/posts'):
        with open(blogpath + '/posts/' + postname + '.meta', 'w') as daympo:
            daympo.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: {}\n.. link:\n.. description:\n.. type: text'.format(postname, postname, fultim, tagblog))
            
        with open(blogpath + '/posts/' + postname + '.md', 'w') as daymark:
            for toar in os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d')):

                daymark.write('![{}]({}{})\n\n'.format(toar.replace('.png', ''), '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d') + '/', toar))
In [ ]:
print(trukcyc['Destination name'])
In [ ]:
from flask import Flask, request, jsonify
import json

import getpass
myusr = getpass.getuser()
homepath = '/home/{}'.format(myusr)
app = Flask(__name__)
In [ ]:
with open('{}/truckspec.json'.format(homepath), 'r') as truatm:
    #opens the truck spec json file. 
    tread = truatm.read()
    
truckdicspec = json.loads(tread)

with open('{}/roadsegdatatest.json'.format(homepath), 'r') as roard:
    #opens the road seg data 
    roadjsconv = roard.read()
    
roadjslod = json.loads(roadjsconv)
      
In [ ]:
 
In [ ]:
@app.route('/getruckspec')
def getruckspec():
    # give it truck id and returns back stats about that truck.
    # todo: return back truck segment details also. 
    #localhost:5555/getruckspec?truckid=410
    #returns back:
    #{"EmptyWeight": 124114, "MaxOperatingWeight": 317514.85, "TruckType": "789C"}
    truckid = request.args.get('truckid')
    #truckid = request.args.get("truckid")
    
    return(jsonify({'EmptyWeight' : truckdicspec['EmptyWeight'][truckid],
        'MaxOperatingWeight' : truckdicspec['MaxOperatingWeight'][truckid], 'TruckType' : truckdicspec['TruckType'][truckid]}))
In [ ]:
@app.route('/getroadsegid')
def getroadsegid():
    #localhost:5555/getroadsegid?segid=95
    #give it id and return back road segiment details. 
    '''
    returns back: {
  "Description-CR": "GRD25-GRD26", 
  "DurationCalculated": 28.0, 
  "EfhLength": 157.9984208252, 
  "EndVelocity": 21.9239999982, 
  "EndWaypointCoordsX": 85976.84, 
  "EndWaypointCoordsY": 36378.8, 
  "EndWaypointCoordsZ_Q-CR": 885.02, 
  "LoadStateDirection_D-CR": "Reverse traversal full", 
  "Payload": 170900, 
  "RiseHeight": 0.15, 
  "SlopeLength": 148.8954144547, 
  "StartVelocity": 13.3919999989, 
  "StartWaypointCoordsX": 85987.63, 
  "StartWaypointCoordsY": 36257.25, 
  "StartWaypointCoordsZ": 885.05, 
  "TargetDuration": 11, 
  "Truck": "TRH405"
}
'''
    segid = request.args.get('segid')
    newdict = dict()

    for paka in roadjslod.keys():
        newdict.update({paka : roadjslod[paka][segid]})
    return(jsonify(newdict))
    #return('everything worked')


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5555) 
In [ ]:
from flask import Flask, request, jsonify
import json

import getpass
myusr = getpass.getuser()
homepath = '/home/{}'.format(myusr)
app = Flask(__name__)


with open('{}/truckspec.json'.format(homepath), 'r') as truatm:
    #opens the truck spec json file. 
    tread = truatm.read()
    
truckdicspec = json.loads(tread)

with open('{}/roadsegdatatest.json'.format(homepath), 'r') as roard:
    #opens the road seg data 
    roadjsconv = roard.read()
    
roadjslod = json.loads(roadjsconv)
      
@app.route('/getruckspec')
def getruckspec():
    # give it truck id and returns back stats about that truck.
    # todo: return back truck segment details also. 
    #localhost:5555/getruckspec?truckid=410
    #returns back:
    #{"EmptyWeight": 124114, "MaxOperatingWeight": 317514.85, "TruckType": "789C"}
    truckid = request.args.get('truckid')
    #truckid = request.args.get("truckid")
    
    return(jsonify({'EmptyWeight' : truckdicspec['EmptyWeight'][truckid],
        'MaxOperatingWeight' : truckdicspec['MaxOperatingWeight'][truckid], 'TruckType' : truckdicspec['TruckType'][truckid]}))

@app.route('/getroadsegid')
def getroadsegid():
    #localhost:5555/getroadsegid?segid=95
    #give it id and return back road segiment details. 
    '''
    returns back: {
  "Description-CR": "GRD25-GRD26", 
  "DurationCalculated": 28.0, 
  "EfhLength": 157.9984208252, 
  "EndVelocity": 21.9239999982, 
  "EndWaypointCoordsX": 85976.84, 
  "EndWaypointCoordsY": 36378.8, 
  "EndWaypointCoordsZ_Q-CR": 885.02, 
  "LoadStateDirection_D-CR": "Reverse traversal full", 
  "Payload": 170900, 
  "RiseHeight": 0.15, 
  "SlopeLength": 148.8954144547, 
  "StartVelocity": 13.3919999989, 
  "StartWaypointCoordsX": 85987.63, 
  "StartWaypointCoordsY": 36257.25, 
  "StartWaypointCoordsZ": 885.05, 
  "TargetDuration": 11, 
  "Truck": "TRH405"
}
'''
    segid = request.args.get('segid')
    newdict = dict()

    for paka in roadjslod.keys():
        newdict.update({paka : roadjslod[paka][segid]})
    return(jsonify(newdict))
    #return('everything worked')


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5555) 

localmeasure-bday

In [1]:
import json
In [ ]:
 
In [2]:
import arrow
In [3]:
def getwholefil(width, height):
    return(width * height)
In [4]:
def givearray(inputarray):
    return(list(inputarray))
In [5]:
def retwatera(wid, hei, mylis):
    return(getwholefil(wid,hei) - sum(givearray(mylis)))
In [6]:
#want to test that the two numbers are int and not str
In [7]:
givearray([2,5,6,2,3])
Out[7]:
[2, 5, 6, 2, 3]
In [8]:
retwatera(5,20, [2,3,6,2,3])
Out[8]:
84
In [9]:
type(givearray('2, 5, 6, 6, 2'))
Out[9]:
list
In [10]:
givearray('2,5,6,6,2')
Out[10]:
['2', ',', '5', ',', '6', ',', '6', ',', '2']
In [11]:
getwholefil(5,10)
Out[11]:
50
In [12]:
thearray = [1,3,5,2,3]
In [13]:
totfil = 50
In [14]:
getwholefil(5,10) - sum(thearray)
Out[14]:
36
In [15]:
with open('/home/wcmckee/local.json', 'r') as locj:
    print(locj.read())
    jsrd = json.loads(locj.read())
{"profiles": [{"source_id": "87235872", "link": "https://facebook.com/john.smith.478653", "source": "facebook"}, {"source_id": "245986569842", "link": "https://instagram.com/johnsmith/", "source": "instagram"}, {"source_id": "72735729779824", "link": "https://twitter.com/johnnysmiddy/", "source": "twitter"}, {"source_id": "37461828371", "link": "https://salesforce.com/customer/abc123", "source": "salesforce"}], "customer_id": "00:14:22:01:23:45", "traits": {"birthdate": "1974-08-01", "gender": "male", "marketing_consent": true, "loyalty_level": "Elite Plus", "email": "john.smith@gmail.com", "last_name": "Smith", "loyalty_number": "AU8759342", "first_name": "John"}}
---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
<ipython-input-15-bf3aab5465f0> in <module>()
      1 with open('/home/wcmckee/local.json', 'r') as locj:
      2     print(locj.read())
----> 3     jsrd = json.loads(locj.read())

/usr/lib/python3.5/json/__init__.py in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    317             parse_int is None and parse_float is None and
    318             parse_constant is None and object_pairs_hook is None and not kw):
--> 319         return _default_decoder.decode(s)
    320     if cls is None:
    321         cls = JSONDecoder

/usr/lib/python3.5/json/decoder.py in decode(self, s, _w)
    337 
    338         """
--> 339         obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    340         end = _w(s, end).end()
    341         if end != len(s):

/usr/lib/python3.5/json/decoder.py in raw_decode(self, s, idx)
    355             obj, end = self.scan_once(s, idx)
    356         except StopIteration as err:
--> 357             raise JSONDecodeError("Expecting value", s, err.value) from None
    358         return obj, end

JSONDecodeError: Expecting value: line 1 column 1 (char 0)
In [19]:
from tinydb import TinyDB, Query
db = TinyDB('/home/wcmckee/db.json')
In [3]:
import tinydb
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-6c7c253d0708> in <module>()
----> 1 import tinydb

ImportError: No module named 'tinydb'
In [26]:
import sqlite3
In [27]:
conn = sqlite3.connect('example.db')
In [28]:
c = conn.cursor()

# Create table
c.execute('''CREATE TABLE identify
             (customer_id, first_name, last_name, email, birthdate, gender, marketing_consent)''')

# Insert a row of data
c.execute("INSERT INTO identify VALUES ('00:14:22:01:23:45','William','Mckee','hammer@gmail.com', '1974-08-01', 'male', 'True')")

# Save (commit) the changes
conn.commit()

# We can also close the connection if we are done with it.
# Just be sure any changes have been committed or they will be lost.
conn.close()
ERROR:root:An unexpected error occurred while tokenizing input
The following traceback may be corrupted or invalid
The error message is: ('EOF in multi-line string', (1, 94))

---------------------------------------------------------------------------
OperationalError                          Traceback (most recent call last)
<ipython-input-28-948fa5b1e7da> in <module>()
      3 # Create table
      4 c.execute('''CREATE TABLE identify
----> 5              (customer_id, first_name, last_name, email, birthdate, gender, marketing_consent)''')
      6 
      7 # Insert a row of data

OperationalError: table identify already exists
In [ ]:
c = conn.cursor()
c.execute("INSERT INTO identify VALUES ('00:14:22:01:23:45','William','Mckee','hammer@gmail.com', '1974-08-01', 'male', 'True')")
conn.commit()
conn.close()
In [29]:
conn = sqlite3.connect('example.db')
c = conn.cursor()
c.execute("INSERT INTO identify VALUES ('00:14:22:01:23:45','William','Mckee','hammer@gmail.com', '1974-08-01', 'male', 'True')")
conn.commit()
conn.close()
In [ ]:
 
In [30]:
conn = sqlite3.connect('example.db')
In [31]:
cur = conn.cursor()
In [35]:
import sqlite3

persons = [
    ("Hugo", "Boss"),
    ("Calvin", "Klein")
    ]

con = sqlite3.connect(":memory:")

# Create the table
con.execute("create table person(firstname, lastname)")

# Fill the table
con.executemany("insert into person(firstname, lastname) values (?, ?)", persons)

# Print the table contents
for row in con.execute("select firstname, lastname from person"):
    print (row)

#print "I just deleted", con.execute("delete from person").rowcount, "rows"
('Hugo', 'Boss')
('Calvin', 'Klein')
In [ ]:
from tinydb import TinyDB, Query
db = TinyDB('path/to/db.json')
#>>> User = Query()
#>>> db.insert({'name': 'John', 'age': 22})
#>>> db.search(User.name == 'John')
In [ ]:
import tinydb
In [34]:
for row in cur.execute("select first_name"):
    print(row)
---------------------------------------------------------------------------
OperationalError                          Traceback (most recent call last)
<ipython-input-34-fef3ecde143d> in <module>()
----> 1 for row in cur.execute("select first_name"):
      2     print(row)

OperationalError: no such column: first_name
In [25]:
cur.fetchall()
Out[25]:
[]
In [248]:
def createprofile(customer_id, profilesource):
    return('The customer id is {} and the profile source is {}'.format(customer_id, profilesource))
In [249]:
createprofile('hammer@gmail.com', 'facebook')
Out[249]:
'The customer id is hammer@gmail.com and the profile source is facebook'
In [24]:
def createfullprofile(first_name, last_name, email, marketing_consent, birthdate, gender):
    db.insert({'first_name' : first_name, 'last_name' : last_name, 'email' : email, 'marketing_consent' : marketing_consent, 'birthdate' : birthdate, 'gender' : gender})
    #return('Hello {} {}. Your email is {}. Marketing opt is {}. Your birthdate is {} and you are a {}'.format(first_name, last_name, email, marketing_consent, birthdate, gender))
    return({'first_name' : first_name, 'last_name' : last_name, 'email' : email, 'marketing_consent' : marketing_consent, 'birthdate' : birthdate, 'gender' : gender})
In [27]:
createfullprofile('something', 'else', 'law123@gmail.com', True, '04/12/1001', 'male')
Out[27]:
{'birthdate': '04/12/1001',
 'email': 'law123@gmail.com',
 'first_name': 'something',
 'gender': 'male',
 'last_name': 'else',
 'marketing_consent': True}
In [39]:
currentime = arrow.now()
In [252]:
birthday = arrow.get('1865-12-08')
In [253]:
print(birthday)
1865-12-08T00:00:00+00:00
In [254]:
currentime - birthday
Out[254]:
datetime.timedelta(55819, 16397, 466220)
In [255]:
birthday.date()
Out[255]:
datetime.date(1865, 12, 8)
In [256]:
birthday.strftime('%d')
Out[256]:
'08'
In [257]:
birthday.strftime('%m')
Out[257]:
'12'
In [258]:
birthday.strftime('%Y')
Out[258]:
'1865'
In [ ]:
 

first_name string true First name of the customer

last_name string true Last name of the customer

email string false The email of the customer

marketing_consent boolean false Whether the customer gives consent to receive marketing material

birthdate string false The birthdate of the customer in the format YYYY-MM-DD

gender string false The gender of the customer

avatar_image string false An image representing the customer

bio string false A brief description of the customer

hometown string false The home town of the customer

link string false A link to the original customer

website string false The customers website

In [259]:
 {
    "customer_id": "00:14:22:01:23:45",
    "profiles": [{
        "source": "facebook",
        "source_id": "87235872",
        "link": "https://facebook.com/john.smith.478653",
    }
  File "<ipython-input-259-9f499cada62d>", line 7
    }
     ^
SyntaxError: unexpected EOF while parsing
In [260]:
with open('/home/wcmckee/local.json', 'r') as locj:
    myjs = json.loads(locj.read())
In [261]:
thetraits = {"traits": {
        "first_name": "John",
        "last_name": "Smith",
        "email": "john.smith@gmail.com",
        "loyalty_level": "Elite Plus",
        "loyalty_number": "AU8759342",
        "birthdate": "1974-10-01",
        "gender": "male",
        "marketing_consent": True
    }}
In [262]:
arge = arrow.get(thetraits['traits']['birthdate'], 'YYYY-MM-DD')
In [263]:
arge.strftime('%m')
Out[263]:
'10'
In [264]:
arge.strftime('%d')
Out[264]:
'01'
In [265]:
currentime.date()
Out[265]:
datetime.date(2018, 10, 6)
In [266]:
currentime.strftime('%m')
Out[266]:
'10'
In [267]:
import requests
In [268]:
unbreq = requests.get('https://api.giphy.com/v1/gifs/translate?api_key=123=unbirthday')
In [269]:
unbjs = unbreq.json()
In [270]:
unbjs['data']['images']['original']['url']
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-270-dace88d96482> in <module>()
----> 1 unbjs['data']['images']['original']['url']

KeyError: 'data'
In [271]:
if arge.strftime('%m') == currentime.strftime('%m'):
    print('it is your birthday month')
else:
    print('it is not your birthday month')
it is your birthday month
In [272]:
if arge.strftime('%m-%d') == currentime.strftime('%d-%m'):
    print('it is your birthday')
    breq = requests.get('https://api.giphy.com/v1/gifs/random?api_key=123&tag=happy birthday&rating=G')
    bjs = unbreq.json()
    print(bjs['data']['images']['original']['url'])
else:
    print('it is not your birthday')
    unbreq = requests.get('https://api.giphy.com/v1/gifs/translate?api_key=123&s=happy unbirthday')
    unbjs = unbreq.json()
    print(unbjs['data']['images']['original']['url'])
it is not your birthday
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-272-87d617035651> in <module>()
      8     unbreq = requests.get('https://api.giphy.com/v1/gifs/translate?api_key=123&s=happy unbirthday')
      9     unbjs = unbreq.json()
---> 10     print(unbjs['data']['images']['original']['url'])

KeyError: 'data'
In [273]:
breq = requests.get('https://api.giphy.com/v1/gifs/translate?api_key=123&s=birthday')
bjs = breq.json()
print(bjs['data']['images']['original']['url'])
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-273-c0bf7608b94e> in <module>()
      1 breq = requests.get('https://api.giphy.com/v1/gifs/translate?api_key=123&s=birthday')
      2 bjs = breq.json()
----> 3 print(bjs['data']['images']['original']['url'])

KeyError: 'data'
In [274]:
arge.strftime('%m-%d') == currentime.strftime('%d-%m')
Out[274]:
False
In [275]:
print(arge.strftime('%m-%d'))
10-01
In [276]:
print(currentime.strftime('%d-%m'))
06-10
In [277]:
abs(int(arge.strftime('%Y')) - int(currentime.strftime('%Y')))
Out[277]:
44
In [278]:
thetraits['traits']['birthdate']
Out[278]:
'1974-10-01'
In [279]:
tdelta = arge.strftime('%Y-%m-%d') - currentime.strftime('%Y-%m-%d')
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-279-f4aabccb4f5f> in <module>()
----> 1 tdelta = arge.strftime('%Y-%m-%d') - currentime.strftime('%Y-%m-%d')

TypeError: unsupported operand type(s) for -: 'str' and 'str'
In [280]:
from datetime import datetime
s1 = thetraits['traits']['birthdate']
s2 = currentime.strftime('%Y-%m-%d') # for example
FMT = '%Y-%m-%d'
tdelta = datetime.strptime(s2, FMT) - datetime.strptime(s1, FMT)
In [281]:
tdelta.days
Out[281]:
16076
In [282]:
present = arrow.now()
paseve = present.shift(days=-7)
futur = present.shift(days=7)
In [283]:
rewardays = list()
In [284]:
for r in arrow.Arrow.span_range('day', paseve, futur):
    #print(r.index)
    myr = r[0]
    print(myr.strftime('%m-%d'))
    rewardays.append(myr.strftime('%m-%d'))
09-29
09-30
10-01
10-02
10-03
10-04
10-05
10-06
10-07
10-08
10-09
10-10
10-11
10-12
10-13
In [285]:
bdayrew = arge.strftime('%m-%d') in rewardays
In [286]:
if bdayrew == True:
    print('its ya bday reward')
else:
    print('its not ya bday reward')
its ya bday reward
In [287]:
def checkbirth(dob):
    arge = arrow.get(dob, 'YYYY-MM-DD')
    bdayrew = arge.strftime('%m-%d') in rewardays
    if bdayrew == True:
        return('its ya bday reward')
    else:
        return('its not ya bday reward')
    
In [288]:
checkbirth('1988-10-01')
Out[288]:
'its ya bday reward'

customer_id string true A customer identifier of customer. If you dont have one you can use the persons device mac or

email address instead.

longitude float true Longitude of the identified device

latitude float true Latitude of the identified device

seen_at string true A datetime when the device was last seen. In the format of a RFC 3339 datetime ( 2017-11-29T08:09:57Z )

In [28]:
"longitude": 151.20919,
    "latitude": -33.88668,
    "seen_at": "2017-11-29T08:09:57Z"
  File "<ipython-input-28-ebff5005de0b>", line 1
    "longitude": 151.20919,
               ^
SyntaxError: invalid syntax
In [ ]:
 
In [29]:
def createlocation(email, longitude, latitude):
    return({'email' : email, 'longitude' : longitude, 'latitude' : latitude})
In [52]:
createlocation('hammers@gmail.com', '151.20919', '-33.88668')
Out[52]:
{'email': 'hammers@gmail.com',
 'latitude': '-33.88668',
 'longitude': '151.20919'}
In [53]:
import requests 
In [ ]:
https://api.opencagedata.com/geocode/v1/json?q=41.40139%2C%202.12870&key=9943e82b6c974d878ff290540e9b9835&language=en&pretty=1
In [70]:
requrl = requests.get('https://api.opencagedata.com/geocode/v1/json?q=151.20919%2C%-33.88668&key=9943e82b6c974d878ff290540e9b9835&language=en&pretty=1')
In [71]:
requrl.json()
Out[71]:
{'documentation': 'https://opencagedata.com/api',
 'licenses': [{'name': 'CC-BY-SA',
   'url': 'https://creativecommons.org/licenses/by-sa/3.0/'},
  {'name': 'ODbL',
   'url': 'https://opendatacommons.org/licenses/odbl/summary/'}],
 'rate': {'limit': 2500, 'remaining': 2486, 'reset': 1539129600},
 'results': [],
 'status': {'code': 200, 'message': 'OK'},
 'stay_informed': {'blog': 'https://blog.opencagedata.com',
  'twitter': 'https://twitter.com/opencagedata'},
 'thanks': 'For using an OpenCage Data API',
 'timestamp': {'created_http': 'Tue, 09 Oct 2018 06:45:57 GMT',
  'created_unix': 1539067557},
 'total_results': 0}
In [49]:
import requests
url = 'https://maps.googleapis.com/maps/api/geocode/json'
params = {'sensor': 'false', 'address': 'Mountain View, CA'}
r = requests.get(url, params=params)
results = r.json()['results']
In [142]:
somejs = {
    "customer_id": "00:14:22:01:23:45",
    "profiles": [{
        "source": "facebook",
        "source_id": "87235872",
        "link": "https://facebook.com/john.smith.478653",
    },{
        "source": "instagram",
        "source_id": "245986569842",
        "link": "https://instagram.com/johnsmith/",
    },{
        "source": "twitter",
        "source_id": "72735729779824",
        "link": "https://twitter.com/johnnysmiddy/",
    },{
        "source": "salesforce",
        "source_id": "37461828371",
        "link": "https://salesforce.com/customer/abc123"
    }],
    "traits": {
        "first_name": "John",
        "last_name": "Smith",
        "email": "john.smith@gmail.com",
        "loyalty_level": "Elite Plus",
        "loyalty_number": "AU8759342",
        "birthdate": "1974-08-01",
        "gender": "male",
        "marketing_consent": True
    }
  }
In [146]:
with open('/home/wcmckee/local.json', 'w') as locwr:
    locwr.write(json.dumps(somejs))
In [147]:
with open('/home/wcmckee/local.json', 'r') as locrd:
    locrd.read()
In [148]:
cat /home/wcmckee/local.json
{"profiles": [{"source_id": "87235872", "link": "https://facebook.com/john.smith.478653", "source": "facebook"}, {"source_id": "245986569842", "link": "https://instagram.com/johnsmith/", "source": "instagram"}, {"source_id": "72735729779824", "link": "https://twitter.com/johnnysmiddy/", "source": "twitter"}, {"source_id": "37461828371", "link": "https://salesforce.com/customer/abc123", "source": "salesforce"}], "customer_id": "00:14:22:01:23:45", "traits": {"birthdate": "1974-08-01", "gender": "male", "marketing_consent": true, "loyalty_level": "Elite Plus", "email": "john.smith@gmail.com", "last_name": "Smith", "loyalty_number": "AU8759342", "first_name": "John"}}
In [153]:
creatime = {
  "data": [
    {
      "created_time": "2017-12-08T01:08:57+0000",
      "message": "Love this puzzle. One of my four coke puzzles",
      "id": "820882001277849_1805191182846921"
    },
    {
      "created_time": "2017-12-07T20:06:14+0000",
      "message": "You need to add grape as a flavor for Coke in your freestyle machines.",
      "id": "820882001277849_1804966026202770"
    },
    {
      "created_time": "2017-12-07T01:29:12+0000",
      "message": "Plz play the old commercial’s with the polar bears. Would be nice to see them this holiday",
      "id": "820882001277849_1804168469615859"
    }
  ]
}
In [150]:
automshrply = 'thank you for the comment. this is an auto responce to let you know we have seen it.'
In [239]:
commenturl = 'https://graph.facebook.com/{}/comments?message={}'.format(creatime['data'][crdata]['id'], automshrply)
In [240]:
commenturl
Out[240]:
'https://graph.facebook.com/820882001277849_1804168469615859/comments?message=thank you for the comment. this is an auto responce to let you know we have seen it.'
In [163]:
for crdata in range(0, len(creatime['data'])):
    print(creatime['data'][crdata])
    
    print(creatime['data'][crdata]['id'])
    #creatime['data']:
    #print(creatime['data'])
    automshrply = 'thank you for the comment. this is an auto responce to let you know we have seen it.'
    commenturl = 'https://graph.facebook.com/{}/comments?message={}'.format(creatime['data'][crdata]['id'], automshrply)
    print(commenturl)
{'message': 'Love this puzzle. One of my four coke puzzles', 'id': '820882001277849_1805191182846921', 'created_time': '2017-12-08T01:08:57+0000'}
820882001277849_1805191182846921
https://graph.facebook.com/820882001277849_1805191182846921/comments?message=thank you for the comment. this is an auto responce to let you know we have seen it.
{'message': 'You need to add grape as a flavor for Coke in your freestyle machines.', 'id': '820882001277849_1804966026202770', 'created_time': '2017-12-07T20:06:14+0000'}
820882001277849_1804966026202770
https://graph.facebook.com/820882001277849_1804966026202770/comments?message=thank you for the comment. this is an auto responce to let you know we have seen it.
{'message': 'Plz play the old commercial’s with the polar bears. Would be nice to see them this holiday', 'id': '820882001277849_1804168469615859', 'created_time': '2017-12-07T01:29:12+0000'}
820882001277849_1804168469615859
https://graph.facebook.com/820882001277849_1804168469615859/comments?message=thank you for the comment. this is an auto responce to let you know we have seen it.
In [72]:
reqfb = requests.get('https://api.meetup.com/self/calendar?photo-host=public&page=20&sig_id=58828502&sig=dcef242c3502d7e7e1f9129220507cb1f31ba3ec')
In [77]:
reqjs = (reqfb.json())
In [100]:
meetlen = len(reqjs)
In [101]:
print(meetlen)
20
In [118]:
for met in range(0, meetlen):
    try:
        print(reqjs[met])
        print(reqjs[met]['venue'])
    except KeyError:
        #pass
        print('error key not found')
{'duration': 9000000, 'name': 'Rescheduled October Data Engineering Meetup, Sydney', 'status': 'upcoming', 'venue': {'name': 'Airtasker', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 3, 71 York St', 'lon': 151.2057342529297, 'id': 25791855, 'localized_country_name': 'Australia', 'lat': -33.86824417114258, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-10', 'group': {'who': 'Data Engineers', 'name': 'Sydney Data Engineering Meetup', 'urlname': 'Sydney-Data-Engineering-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 26144847, 'created': 1507081242000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539154800000, 'link': 'https://www.meetup.com/Sydney-Data-Engineering-Meetup/events/255260041/', 'local_time': '18:00', 'yes_rsvp_count': 100, 'utc_offset': 39600000, 'description': '<p>Airtasker have kindly offered to host us this month.</p> <p>We have 3 awesome speakers:<br/>- Dan Gooden<br/>- Claire Carroll<br/>- Nick Wienholt</p> <p>******************************</p> <p>1st Talk - Dan Gooden:<br/>Testing Patterns in Code Driven SQL Data Pipelines<br/>Consistent and automated testing builds confidence in datasets, catches change in upstream systems, and ensures reliability so you can build more complex models safely.</p> <p>In this talk I\'ll cover ideas I\'ve developed over the past few years about useful testing patterns in fast moving, small data teams writing code driven SQL pipelines.</p> <p>Dan Gooden is the Data Lead at Airtasker, where he is responsible for ensuring the company leverages data internally to discover valuable insights, and externally for the benefit of its users of our platform. He has a keen interest in ensuring data has a meaningful relationship to the activities that companies undertake in the world.<br/>Before Airtasker, Dan worked for the Domain Group as the Data Engineering Platform Lead, where he was responsible for creating and managing a team that built the data warehouse. Prior to that he contracted for many years in the DW &amp; BI space.</p> <p>******************************</p> <p>2nd Talk - Claire Carroll<br/>Sharing beautiful data documentation<br/>One of the hardest parts of building a data-driven culture is making sure everyone is speaking the same language – in essence, answering the question “what does this number mean, and where does it come from?”<br/>Attempts to share this knowledge usually come in the form of building a “databook”, either built as a bespoke solution, or by using off the shelf products like Confluence.<br/>In this talk, I’m going to demonstrate how open source tool dbt has solved this problem.<br/>---<br/>Claire is a Data Analyst at Airtasker, and Community Manager for dbt.</p> <p>******************************</p> <p>3rd Talk - Nick Wienholt:<br/>Designing and implementing an automated trading system based on many disparate data sources, using multiple machine learning models and executing across multiple exchanges is an interesting engineering challenge, and one in with reference architectures are very much at the embryonic stage.<br/>In this presentation, Nick will present a complete architecture based on a number of open-source tools including Redis, Kafka and Spark, and examine a number of the possible design approaches.</p> <p>Nick is a consulting data and quantitive engineering based in Sydney. With a focus on high volume trading systems based on machine learning and alternate data, Nick enjoys working with a variety of clients on both the buy- and sell-side in the financial market and gaming industry.</p> <p>******************************</p> <p>We have our own slack group and website which you can find out more details about here: <a href="https://sydneydataengineers.github.io/" class="linkified">https://sydneydataengineers.github.io/</a></p> ', 'waitlist_count': 21, 'rsvp_limit': 100, 'id': '255260041', 'created': 1538644634000, 'updated': 1538644634000}
{'name': 'Airtasker', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 3, 71 York St', 'lon': 151.2057342529297, 'id': 25791855, 'localized_country_name': 'Australia', 'lat': -33.86824417114258, 'country': 'au'}
{'duration': 14400000, 'name': 'Flutter Study Jam Session 2', 'status': 'upcoming', 'venue': {'name': 'Google offices @ Fairfax', 'repinned': True, 'city': 'Sydney', 'address_1': ' 2/1 Darling Island Rd, Pyrmont NSW 2009', 'lon': 151.19580078125, 'id': 16805462, 'localized_country_name': 'Australia', 'lat': -33.864994049072266, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-10', 'group': {'who': 'Members', 'name': 'GDG Sydney', 'urlname': 'gdgsydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 1955151, 'created': 1306990800000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539154800000, 'link': 'https://www.meetup.com/gdgsydney/events/254578726/', 'local_time': '18:00', 'yes_rsvp_count': 63, 'utc_offset': 39600000, 'description': '<p>Flutter is Google’s mobile app SDK for crafting high-quality native interfaces on iOS and Android in record time. Flutter works with existing code, is used by developers and organizations around the world, and is free and open source.</p> <p>Following our \'Getting Ready to Flutter’ meetup, we are stepping up our game and we’re starting a full-fledged series of Flutter Study Jam sessions.<br/>These sessions are prepared by the Flutter team at Google and start off with the *very* basics of Flutter. So if you\'ve never used Flutter before, come join in. If you have some basic knowledge, it might not be the best use of your time :)<br/>Keep in mind that this is a hands-on session, so not many spots are available. Make sure you can attend before you RSVP so you give other people a chance.</p> <p>Bring a laptop! If you want to hit the ground running, make sure you have Flutter installed on your laptop prior to starting the Study Jam. Windows/Linux/Mac are all fine. Check out : <a href="https://flutter.io/get-started/install/" class="linkified">https://flutter.io/get-started/install/</a></p> <p>This is Session 2 of 3 and each session builds on the previous<br/>Session 1: <a href="https://www.meetup.com/gdgsydney/events/254578527/" class="linkified">https://www.meetup.com/gdgsydney/events/254578527/</a><br/>Session 3: <a href="https://www.meetup.com/gdgsydney/events/254578753/" class="linkified">https://www.meetup.com/gdgsydney/events/254578753/</a></p> <p>Agenda:</p> <p>Zarah: Welcome!</p> <p>Quirijn &amp; Brett: Kicking off the Flutter Study jam part 2!</p> <p>Break</p> <p>Quirijn &amp; Brett: More Flutter!</p> <p>Thank you\'s, hugs and goodbye\'s</p> ', 'waitlist_count': 0, 'rsvp_limit': 76, 'pro_is_email_shared': True, 'id': '254578726', 'created': 1536660672000, 'updated': 1539081976000, 'how_to_find_us': 'Proceed to One Darling Island Road (the Domain/Fairfax building), security will check you off a list then beep you in through the gates and up the lift to level 2.'}
{'name': 'Google offices @ Fairfax', 'repinned': True, 'city': 'Sydney', 'address_1': ' 2/1 Darling Island Rd, Pyrmont NSW 2009', 'lon': 151.19580078125, 'id': 16805462, 'localized_country_name': 'Australia', 'lat': -33.864994049072266, 'country': 'au'}
{'duration': 10800000, 'name': 'Designing for Fintech and Financial Empowerment!', 'status': 'upcoming', 'venue': {'name': 'Academy Xi', 'repinned': True, 'city': 'Sydney', 'address_1': '48 Druitt St', 'lon': 151.2044677734375, 'id': 25646241, 'localized_country_name': 'Australia', 'lat': -33.87266159057617, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-10', 'group': {'who': 'Members', 'name': 'Sydney Designers', 'urlname': 'Sydney-Designers-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 22933614, 'created': 1489905312000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Sydney-Designers-Meetup/events/254869658/', 'local_time': '18:00', 'yes_rsvp_count': 139, 'utc_offset': 39600000, 'description': '<p>Design has evolved since the days of the ATM. From splitting bills to checking our account balance—designers work to improve our financial lives, and provide us the freedom to make financial decisions at the touch of a button.</p> <p>Join the best designers from the fintech and finance world as they talk about the challenges of creating a first class customer experience, and discuss all the unseen challenges of complying with numbers and regulations. Finance is ripe with design opportunities—learn from these war stories and victories!</p> ', 'waitlist_count': 0, 'id': '254869658', 'time': 1539154800000, 'created': 1537490162000, 'updated': 1537490611000}
{'name': 'Academy Xi', 'repinned': True, 'city': 'Sydney', 'address_1': '48 Druitt St', 'lon': 151.2044677734375, 'id': 25646241, 'localized_country_name': 'Australia', 'lat': -33.87266159057617, 'country': 'au'}
{'duration': 10800000, 'name': 'Meet up with your analytics peers and chat', 'status': 'upcoming', 'venue': {'name': 'Mr Tipplys ', 'repinned': True, 'city': 'Sydney', 'address_1': '347 Kent Street, Sydney NSW 2000 ', 'lon': 151.2041473388672, 'id': 24591303, 'localized_country_name': 'Australia', 'lat': -33.868370056152344, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-10', 'group': {'who': 'Analysts', 'name': 'Web Analytics Wednesday Sydney', 'urlname': 'Web-Analytics-Wednesday-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 13894792, 'created': 1397440322000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Web-Analytics-Wednesday-Sydney/events/fgqcqpyxnbnb/', 'local_time': '18:30', 'yes_rsvp_count': 35, 'utc_offset': 39600000, 'description': "<p>Every second Wednesday of the month the digital analytics community gets together for one or two short talks in an informal setting. There's lots of time for open ended discussion and to socialise. Plus there's free drinks!</p> <p>Open to anyone interested in digital analytics, we have people ranging from beginners through to analytics gurus and from marketing through to technical spaces, and everything in between.</p> <p>Web Analytics Wednesday is a great person to learn about what's happening in the digital analytics field, meet your peers and solve problems together.</p> <p>=========<br/>This Month our talks are:<br/>Brian Do, Datalicious - Custom Funnel Reporting in Google Analytics</p> <p>Panel - How to Find and Grow Good Analytics Talent</p> ", 'waitlist_count': 0, 'id': 'fgqcqpyxnbnb', 'time': 1539156600000, 'created': 1524452619000, 'updated': 1538970947000, 'how_to_find_us': 'Upstairs on the first floor'}
{'name': 'Mr Tipplys ', 'repinned': True, 'city': 'Sydney', 'address_1': '347 Kent Street, Sydney NSW 2000 ', 'lon': 151.2041473388672, 'id': 24591303, 'localized_country_name': 'Australia', 'lat': -33.868370056152344, 'country': 'au'}
{'duration': 9000000, 'name': 'SydPWA October 2018', 'status': 'upcoming', 'venue': {'name': 'SiteMinder', 'repinned': True, 'city': 'Sydney', 'address_1': 'Ground Floor, 88 Cumberland St, The Rocks NSW 2000', 'lon': 151.2075958251953, 'id': 25500141, 'localized_country_name': 'Australia', 'lat': -33.85821533203125, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-11', 'group': {'who': 'Members', 'name': 'Sydney Progressive Web Apps | SydPWA', 'urlname': 'SydPWA', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 20454233, 'created': 1474544374000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539241200000, 'link': 'https://www.meetup.com/SydPWA/events/254438926/', 'local_time': '18:00', 'yes_rsvp_count': 87, 'utc_offset': 39600000, 'description': '<p>PWA is becoming a major hot topic in web development recently. The SiteMinder and WebDirections are here to back you up with some tech details, news, and food.</p> <p>This time we have two topic:<br/>* Next Generation mobile retail with PWA and AMP<br/>* 0 to PWA in minutes (convert an existing site into a PWA)</p> <p>Sponsors:<br/>* Forever awesome SiteMinder which buys food 🍕 and is also our host 🏢 for the night. <a href="https://www.siteminder.com/" class="linkified">https://www.siteminder.com/</a><br/>* The prominent WebDirection conferences will get you some drinks 🍺. And we\'ve got some discounts to their upcoming event 🎉. <a href="https://www.webdirections.org/wds/" class="linkified">https://www.webdirections.org/wds/</a></p> <p>Talk 1️⃣</p> <p>Time: 6:30 - 7:15pm</p> <p>Title: Next Generation mobile retail with PWA and AMP</p> <p>Info: How brands and retailers are leveraging the Progressive Web Apps to increase their mobile revenue and customer engagement. And some tech stuff that makes it all possible</p> <p>Presenters:<br/>Dean Maslic - Founder and Principal at Commerce Right, veteran of the mobile web and ecommerce solution expert<br/>James Semple - Lead Solutions Engineer at Mobify, mobile commerce evangelist, solution architect and thought leader</p> <p>Background:<br/>Commerce Right is a boutique consulting firm helping brands and retailers deliver omni-channel commerce solutions. Based in Sydney it is the only Mobify Implementation Partner in Australia<br/>Mobify is a digital experience platform for building modern, customer-first shopping experiences through Progressive Web Apps (PWA), Accelerated Mobile Pages (AMP), and native apps. Established in 2007, Mobify is headquartered in Vancouver, Canada</p> <p>Talk 2️⃣</p> <p>Time: 7:30pm - 8:00pm</p> <p>Title: 0 to PWA in minutes</p> <p>Info: A lightning journey on the fundamentals required to convert an existing site into a PWA.</p> <p>Presenter:<br/>Marcin Piekarski - Started learning how to build sites back in 1998 using Netscape and a text editor. Have worked on projects for companies big and small, including Carsguide, CBA, etc. With my most recent project being the implementation of a PWA on the Harvey Norman, Domayne and Joyce Mayne websites.</p> <p>Background:<br/>Harvey Norman is Harvey Norman.</p> <p>See ya there, friends!</p> ', 'waitlist_count': 0, 'rsvp_limit': 100, 'id': '254438926', 'created': 1536225867000, 'updated': 1538735083000}
{'name': 'SiteMinder', 'repinned': True, 'city': 'Sydney', 'address_1': 'Ground Floor, 88 Cumberland St, The Rocks NSW 2000', 'lon': 151.2075958251953, 'id': 25500141, 'localized_country_name': 'Australia', 'lat': -33.85821533203125, 'country': 'au'}
{'duration': 9900000, 'name': 'Fitness Friday Night Pitches', 'status': 'upcoming', 'venue': {'name': 'Fishburners', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 2/3 11-31 York St,', 'lon': 151.20526123046875, 'id': 25668721, 'localized_country_name': 'Australia', 'lat': -33.86531066894531, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-12', 'group': {'who': 'Entrepreneurs', 'name': 'Fishburners Meetup', 'urlname': 'Fishburners-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 17306242, 'created': 1412058644000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Fishburners-Meetup/events/cgglfqyxnbqb/', 'local_time': '17:15', 'yes_rsvp_count': 93, 'utc_offset': 39600000, 'description': "<p>Every Friday night from 5:15pm, Fishburners opens its doors to host startup community pitches and networking!</p> <p>If you're looking for inspiration to start a business, learn some pitch tips or just find out what new tech startups are happening in Sydney, this event is for you.</p> <p>Please note that Fishburners values inclusive communities and all events hosted here are governed by our community code of conduct. This stems from our desire to run a productive and valuable night for our founders and all attendees where everyone feels safe and welcome to attend, bring friends, meet new people, and enjoy the start up landscape.</p> <p>To make sure we have the best environment in support of this, in cases where someone is detracting from this goal they may be asked to leave for excessive drinking, antisocial behaviour, speaking during the pitches or disengagement with the purpose of the night. If any issues arise that make you feel uncomfortable please don’t hesitate to come and speak to one of the team.</p> <p>We hope that by this everyone will enjoy an even more energetic and exciting event, continuing to grow in numbers as we focus all our considerable resources on growing and supporting the skills and passions of us all in the start up industry.</p> <p>The event schedule for Friday Night Pitches is as follows:</p> <p>• 5:15PM: Networking &amp; drinks</p> <p>• 5:30PM: Pitches begin. Grab a seat! No talking during this time :)</p> <p>• 6:30PM-8PM (approx.): Networking</p> <p>• 8PM: Event concludes</p> <p>See you soon!</p> ", 'waitlist_count': 0, 'id': 'cgglfqyxnbqb', 'time': 1539324900000, 'created': 1518060109000, 'updated': 1538539613000, 'how_to_find_us': 'Come up to Level 3 in the lifts'}
{'name': 'Fishburners', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 2/3 11-31 York St,', 'lon': 151.20526123046875, 'id': 25668721, 'localized_country_name': 'Australia', 'lat': -33.86531066894531, 'country': 'au'}
{'duration': 5400000, 'name': 'Read and chat', 'status': 'upcoming', 'venue': {'name': 'Location TBC', 'repinned': False, 'city': 'Sydney', 'address_1': 'TBC, Sydney', 'lon': 151.20689392089844, 'id': 12231022, 'localized_country_name': 'Australia', 'lat': -33.87364959716797, 'country': 'au'}, 'visibility': 'public_limited', 'local_date': '2018-10-14', 'rsvp_close_offset': 'PT1H30M', 'group': {'who': 'bibliophagists', 'name': 'Warm Brew and Reading Crew - Sydney', 'urlname': 'Warm-Brew-Reading-Crew-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 20004787, 'created': 1464404904000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'approval', 'timezone': 'Australia/Sydney'}, 'time': 1539477000000, 'link': 'https://www.meetup.com/Warm-Brew-Reading-Crew-Sydney/events/251980213/', 'local_time': '11:30', 'yes_rsvp_count': 15, 'utc_offset': 39600000, 'description': "<p>Come along and discuss what you've been reading, and enjoy some good company and good conversation! As always, bring a book and an open mind.</p> <p>Happy to take suggestions for read and chat locations - let me know in the comments :)</p> ", 'waitlist_count': 7, 'rsvp_limit': 15, 'id': '251980213', 'created': 1529558013000, 'updated': 1529558065000, 'how_to_find_us': 'More details will be posted closer to Meetup date'}
{'name': 'Location TBC', 'repinned': False, 'city': 'Sydney', 'address_1': 'TBC, Sydney', 'lon': 151.20689392089844, 'id': 12231022, 'localized_country_name': 'Australia', 'lat': -33.87364959716797, 'country': 'au'}
{'duration': 7200000, 'name': 'Algorithms, Graphs and Awesome Procedures', 'status': 'upcoming', 'visibility': 'public', 'local_date': '2018-10-15', 'group': {'who': 'Graphistas', 'name': 'GraphDB Sydney', 'urlname': 'GraphDB-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 8031902, 'created': 1365761897000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539588600000, 'link': 'https://www.meetup.com/GraphDB-Sydney/events/wfjtzpyxnbtb/', 'local_time': '18:30', 'yes_rsvp_count': 56, 'utc_offset': 39600000, 'description': "<p>After a festival season, it's time to reunion! This year, there will be a lot more to expect. Our Sydney meetup will be regular event held at our partner's venue in Sydney CBD. There will be more guest speakers, case studies, product shows, and of course food and other fun stuff.</p> <p>(graphs) -[:are]-&gt; (everywhere)</p> ", 'waitlist_count': 0, 'rsvp_limit': 150, 'id': 'wfjtzpyxnbtb', 'created': 1517792751000, 'updated': 1531893162000}
error key not found
{'duration': 9000000, 'name': 'Voice - the interface of the future', 'status': 'upcoming', 'venue': {'name': 'Deloitte', 'repinned': True, 'city': 'Sydney', 'address_1': 'Grosvenor Place, Level 9, 225 George Street, Sydney, NSW, 2000, Australia Sydney', 'lon': 151.20733642578125, 'id': 1682781, 'localized_country_name': 'Australia', 'lat': -33.86573028564453, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-16', 'group': {'who': 'Disruptors', 'name': 'Disruptors in Tech', 'urlname': 'Disruptors-in-Tech', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 19155708, 'created': 1448508113000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539671400000, 'link': 'https://www.meetup.com/Disruptors-in-Tech/events/fjwqtpyxmbpb/', 'local_time': '17:30', 'yes_rsvp_count': 229, 'utc_offset': 39600000, 'description': '<p>You need your Eventbrite ticket to attend: <a href="https://www.eventbrite.com.au/e/voice-the-interface-of-the-future-tickets-48151202543" class="linkified">https://www.eventbrite.com.au/e/voice-the-interface-of-the-future-tickets-48151202543</a></p> <p>Session 1<br/>The Return of the Voice Interface – why voice is making a big comeback<br/>Technology is driving disruption across many industries inspiring new business models and reinventing the way consumers and service providers interact. It is not that long ago that businesses and service providers strongly directed consumers towards a web based service model and then again towards a mobile based engagement. Now, the “good old” voice interaction is making a comeback! In this session we will we will look at recent trends that brought voice back to the centre of the stage and what that may mean for the future</p> <p>Zack Levy, Partner | DevOps &amp; Automation, Deloitte Australia<br/>Zack has more than 25 of experience in the ICT industry with corporations in Australia and internationally spanning from software development, data centre environments and in particular, cloud technologies. He is also well-trained with a combination of technical and commercial expertise. Zack is passionate when it comes to technology, it is his profession and hobby. He is a big believer in cloud platforms and excited to be part of today’s digital transformation.</p> <p>Session 2<br/>Cognitive Customer Experience (CX)<br/>Philip will demonstrate how AWS is progressively using AI and Machine Learning enabled technologies to expand the ways in which their customers deliver improved CX. He will talk through how Voice is evolving into the new CX interface of preference, and how you can think of new and inventive ways to delight your customers.</p> <p>Philip Zammit, Amazon Connect, Amazon Web Services<br/>Phillip is an experienced business executive with deep domain experience and expertise in the Customer Experience, Contact Centre and Customer Service industry for over 20 years. With a focus on innovation and customer outcomes, Phillip has developed deep engagements across many industries and a track record of quantifiable results.</p> <p>Session 3<br/>The practical aspects of implementing voice services<br/>A discussion on the transition of business from a visual web content paradigm to a natural speaking based conversational experience. It will focus on how to leverage existing web content and infrastructure to create the building blocks that can then be used to facilitate complex yet simple voice user experiences and facilitate transactions.</p> <p>Simon Horne, CEO Alkira Software<br/>Simon is CEO of Alkira Software an innovative conversational commerce technology company that focuses on the transition from visual lead web content to an audio based brand experience. Simon personally is an experienced entrepreneur and angel investor with more then 15 years international startup experience having started a number of businesses in Asia and more recently in the US. The most successful was the silicon valley startup BlueJeans which he joined as employee #1 and helped create the business idea and form the foundation team in 2009.</p> <p>Agenda: (Please arrive before 6 PM to start on time)</p> <p>5.45 PM - Drinks will be served<br/>6.00 PM - 6.30 PM - Session 1<br/>6.30 PM - 7.00 PM - Session 2<br/>7.00 PM - 7.30 PM - Break<br/>7.30 PM - 8:00 PM - Session 3</p> <p>Feel free to share event details, pictures and learnings and tag #DisruptorsInTech</p> <p>See you soon!</p> ', 'waitlist_count': 24, 'rsvp_limit': 20, 'id': 'fjwqtpyxmbpb', 'created': 1487033090000, 'updated': 1538528674000}
{'name': 'Deloitte', 'repinned': True, 'city': 'Sydney', 'address_1': 'Grosvenor Place, Level 9, 225 George Street, Sydney, NSW, 2000, Australia Sydney', 'lon': 151.20733642578125, 'id': 1682781, 'localized_country_name': 'Australia', 'lat': -33.86573028564453, 'country': 'au'}
{'duration': 9000000, 'name': 'Monthly Meetup - October', 'status': 'upcoming', 'venue': {'name': 'Atlassian', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 6, 341 George Street', 'lon': 151.2065887451172, 'id': 24353819, 'localized_country_name': 'Australia', 'lat': -33.86717987060547, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-16', 'group': {'who': 'Developers', 'name': 'Android Australia User Group - Sydney', 'urlname': 'Android-Australia-User-Group-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 1954971, 'created': 1306988708000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539673200000, 'link': 'https://www.meetup.com/Android-Australia-User-Group-Sydney/events/255359341/', 'local_time': '18:00', 'yes_rsvp_count': 39, 'utc_offset': 39600000, 'description': '<p>• What we\'ll do<br/>Thanks to our generous sponsors Atlassian for our venue, pizzas and drinks.</p> <p>Doors open at 6pm, and we\'ll start talks 6.30pm:</p> <p>We\'re be back at our usual place, at 341 George street.</p> <p>This month Orhan Obut from Atlassian will be sharing his experience on what it\'s like working as a platform developer</p> <p>"Working on an application is one thing, but working entirely on libraries (components) that are consumed by applications is another. In this talk, I’ll share my experience of being a platform developer and practices we follow."</p> <p>And Indrajit Chakrabarty has kindly volunteered to give us a "Recap of KotlinConf 2018"!</p> <p>We ask that attendees please take note of the recently published Code of Conduct (<a href="http://bit.ly/AndroidCoC" class="linkified">http://bit.ly/AndroidCoC</a>) on the About Us section of this meetup page (and pinned on the #android Slack (<a href="http://bit.ly/view-src" class="linkified">http://bit.ly/view-src</a>) channel)</p> <p>When you arrive, the lifts will be locked, but some very kind Atlassian employees will be there to let us up to level 6. Please try and arrive by 6.20pm so that they can catch the talks from 6.30pm. If you are running late, there will be a mobile number you can call from the lobby, but if you can, please try and avoid that so that our lovely hosts can see all the talks :)</p> <p>Questions or suggestions, please email sydneyaaug@gmail.com, ping @zarah or @ne\'mi a direct message on Slack (<a href="http://view-source-radboats.herokuapp.com/" class="linkified">http://view-source-radboats.herokuapp.com/</a>)</p> ', 'waitlist_count': 0, 'rsvp_limit': 100, 'id': '255359341', 'created': 1538994454000, 'updated': 1539059288000, 'how_to_find_us': 'We are back at our usual place. Enter through the main entrance of Westpack'}
{'name': 'Atlassian', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 6, 341 George Street', 'lon': 151.2065887451172, 'id': 24353819, 'localized_country_name': 'Australia', 'lat': -33.86717987060547, 'country': 'au'}
{'duration': 5400000, 'name': 'From App Idea to Funded Startup', 'status': 'upcoming', 'venue': {'name': 'CUB Business Club', 'repinned': True, 'city': 'Sydney', 'address_1': '3 Kings Cross Road', 'lon': 151.22398376464844, 'id': 25294756, 'localized_country_name': 'Australia', 'lat': -33.876190185546875, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-17', 'group': {'who': 'Founders', 'name': 'From App Idea to Funded Startup', 'urlname': 'Have-an-idea-for-an-app', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 19507156, 'created': 1454580687000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539757800000, 'link': 'https://www.meetup.com/Have-an-idea-for-an-app/events/255357393/', 'local_time': '17:30', 'yes_rsvp_count': 12, 'utc_offset': 39600000, 'description': "<p>Got the next great app idea but you’re not sure how to get it off the ground? Join our free meetup and let's discuss how to validate a business idea, fund a startup and turn it into a successful tech company. Nibbles and drinks are on us. :)</p> ", 'waitlist_count': 0, 'rsvp_limit': 30, 'id': '255357393', 'created': 1538983312000, 'updated': 1539068373000}
{'name': 'CUB Business Club', 'repinned': True, 'city': 'Sydney', 'address_1': '3 Kings Cross Road', 'lon': 151.22398376464844, 'id': 25294756, 'localized_country_name': 'Australia', 'lat': -33.876190185546875, 'country': 'au'}
{'duration': 10800000, 'name': 'SydJS.S — Showcase', 'status': 'upcoming', 'venue': {'name': 'Atlassian Headquarters', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 6, 341 George St', 'lon': 151.20692443847656, 'id': 9682622, 'localized_country_name': 'Australia', 'lat': -33.86726760864258, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-17', 'group': {'who': 'Members', 'name': 'SydJS.S', 'urlname': 'SydJS-S', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 24557686, 'created': 1497904662000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539759600000, 'link': 'https://www.meetup.com/SydJS-S/events/cslqcqyxnbwb/', 'local_time': '18:00', 'yes_rsvp_count': 65, 'utc_offset': 39600000, 'description': '<p><img src="https://secure.meetupstatic.com/photos/event/5/5/3/4/600_465441812.jpeg" /></p> <p>The SydJS.S Showcase meeting series is designed to introduce Companies and Teams active in the Sydney JavaScript community to the Community at large.</p> <p>Each month, we\'ll showcase the people making changes to the Web we work with. Keen to find out how teams are working? Want to meet a culture to see if you\'d be a good match? You need to jin us at SydJS.S</p> <p>On the night you can meet and learn from some of Sydney\'s finest.</p> ', 'waitlist_count': 0, 'rsvp_limit': 150, 'id': 'cslqcqyxnbwb', 'created': 1510567820000, 'updated': 1534253757000}
{'name': 'Atlassian Headquarters', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 6, 341 George St', 'lon': 151.20692443847656, 'id': 9682622, 'localized_country_name': 'Australia', 'lat': -33.86726760864258, 'country': 'au'}
{'duration': 7200000, 'name': 'SydJS', 'status': 'upcoming', 'venue': {'name': 'Atlassian Headquarters', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 6, 341 George St', 'lon': 151.20692443847656, 'id': 9682622, 'localized_country_name': 'Australia', 'lat': -33.86726760864258, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-17', 'group': {'who': 'Members', 'name': 'SydJS: Classic', 'urlname': 'SydJS-Classic', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 26631779, 'created': 1510987258000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539759600000, 'link': 'https://www.meetup.com/SydJS-Classic/events/grqcgqyxnbwb/', 'local_time': '18:00', 'yes_rsvp_count': 54, 'utc_offset': 39600000, 'description': '<p>• What we\'ll do<br/>Every 4th Wednesday of the month you\'ll find us talking about what we\'re doing and what\'s happening around us in the world of JavaScript.</p> <p>• Important to know<br/><a href="https://sydjs.com/about#CoC" class="linkified">https://sydjs.com/about#CoC</a></p> ', 'waitlist_count': 0, 'rsvp_limit': 100, 'id': 'grqcgqyxnbwb', 'created': 1537203864000, 'updated': 1537203864000}
{'name': 'Atlassian Headquarters', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 6, 341 George St', 'lon': 151.20692443847656, 'id': 9682622, 'localized_country_name': 'Australia', 'lat': -33.86726760864258, 'country': 'au'}
{'duration': 9000000, 'name': 'Sydney Design Thinking Meetup #33: Design Thinking in Social Enterprises', 'status': 'upcoming', 'venue': {'name': 'ThoughtWorks', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 10, 50 Carrington Street', 'lon': 151.2065887451172, 'id': 25702956, 'localized_country_name': 'Australia', 'lat': -33.866329193115234, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-18', 'rsvp_close_offset': 'PT2H', 'group': {'who': 'Design Thinkers', 'name': 'Sydney Design Thinking Meetup', 'urlname': 'Sydney-Design-Thinking-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 18596799, 'created': 1431584824000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539846000000, 'link': 'https://www.meetup.com/Sydney-Design-Thinking-Meetup/events/249478643/', 'local_time': '18:00', 'yes_rsvp_count': 120, 'utc_offset': 39600000, 'description': '<p>Please join us for some thought provoking conversation with interesting people keen on design thinking. This month we\'re talking about design thinking in social enterprises. We\'ll have a couple of speakers followed by a panel.</p> <p>MEET THE SPEAKERS/PANEL<br/>**Julia Suh - Director of Small Shift**<br/>Julia is a leading voice in citizen-led urbanism, and specialises in applying human-centred design as a tool for social change and advocacy. Julia’s purpose is to support people to build a sense of belonging to their local places and community; and create a new kind of city-making narrative — one that includes people on the margin. Julia has taught and practiced architecture, placemaking and urban design in New York, Auckland, Hanoi and Sydney, building an extensive knowledge of various communities; and urban spaces that support or neglect them. In 2017, Julia was awarded Westpac Social Change Fellowship and has been backed by Westpac Bicentennial Foundation.</p> <p>Is top-down urban development killing our spirit, our innate ability to self-organise and improve our own lives and places? Julia spent her formative years in a 5,500-unit masterplanned ‘village’ in Seoul, seen neighbourliness shine in post-earthquake Christchurch, and worked with incredibly talented people who are experiencing homelessness and isolation in Sydney. To build community resilience, social trust and employment pathways, her bottom-up social enterprise Small Shift supports locals to reimagine and create public spaces together. Learn more about how she is taking the Small Shift model to disadvantaged areas, and contribute your thoughts on how we can create inclusive communities.</p> <p>**Bronte Hogarth - Founder of Raise The Bar**<br/>Bronte Hogarth is a social entrepreneur from Sydney. In 2017, she started Raise The Bar which diverts used coffee grounds from landfill by turning them into natural skincare products. Bronte recently completed a successful crowdfunding campaign to launch Raise The Bar and was one of the Foundation For Young Australian\'s Young Social Pioneers in 2017.</p> <p>Bronte will share some of her journey with starting Raise The Bar.</p> <p>**Kath Hamilton - Founder of loop+**<br/>Kath has 15+ years experience as a digital executive leading product, engineering, business development and marketing. Her extensive experience with blue-chip corporates such as Yahoo!7, News Corp, Telstra and Westfield now combines with her passion to build products that can radically improve the lives of others. loop+ was conceived to support the functional recovery of her nephew who sustained a spinal cord injury at birth.</p> <p>loop+ is an activity tracker for wheelchair users that monitors health risks in everyday life. The platform is comprised of a sensor pad which remains in the wheelchair connected to a mobile app and dashboard for remote clinical monitoring. For the first time, wheelchair users who either can’t feel their lower limbs or are non-verbal and unable to communicate their discomfort, have a way to visualise what’s going on with their body. Remote monitoring supports early detection and intervention of pressure wounds, respiratory issues and scoliosis.</p> <p>**Ben Pecotich - Founder/Design &amp; Innovation Director of Dynamic4**<br/>Ben is a designer, innovation coach, and social enterprise founder. In addition to Dynamic4, he\'s the CTO &amp; co-founder of Better Goals (<a href="http://bettergoals.com.au" class="linkified">http://bettergoals.com.au</a>), a social enterprise helping people with intellectual disability develop more independence. He\'s also a founder of the Sydney Design Thinking meetup.</p> <p>Dynamic4 (<a href="https://dynamic4.com" class="linkified">https://dynamic4.com</a>) is a purpose-driven design &amp; innovation company, and certified B Corp. They collaborate with people to design and build ideas for happier communities that are more empowered, inclusive, and sustainable. Jetpack for Changemakers (<a href="https://dynamic4.com/jetpack" class="linkified">https://dynamic4.com/jetpack</a>) is Dynamic4\'s coaching/incubator program for early stage social enterprises.</p> <p>EVENT SPONSOR<br/>Thanks to ThoughtWorks Sydney for hosting us and providing refreshments.</p> ', 'waitlist_count': 50, 'rsvp_limit': 120, 'id': '249478643', 'created': 1522975175000, 'updated': 1539084625000, 'how_to_find_us': 'Where all the buses are at Wynyard train station'}
{'name': 'ThoughtWorks', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 10, 50 Carrington Street', 'lon': 151.2065887451172, 'id': 25702956, 'localized_country_name': 'Australia', 'lat': -33.866329193115234, 'country': 'au'}
{'duration': 5400000, 'name': 'Casual Get Together Over Coffee, Tea or Brunch', 'status': 'upcoming', 'venue': {'name': 'UPPERROOM RESTCAFE', 'repinned': False, 'city': 'Sydney', 'address_1': '220 Pitt Street', 'lon': 151.2082977294922, 'id': 24154967, 'localized_country_name': 'Australia', 'lat': -33.87184143066406, 'country': 'au'}, 'visibility': 'public_limited', 'local_date': '2018-10-18', 'group': {'who': 'friends', 'name': 'Depression Anxiety Sydney', 'urlname': 'Depression-Anxiety-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 10312802, 'created': 1379459689000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Depression-Anxiety-Sydney/events/xfmvtpyxnbxb/', 'local_time': '11:00', 'yes_rsvp_count': 3, 'utc_offset': 39600000, 'description': "<p>This is a daytime occasion for members who can attend and it's held in the city about 2 minutes walk from Town Hall Station.</p> <p>At this stage this will be a self-run meetup for members and guests. RSVP only if you are really going to attend. Any member wishing to host the meetup on a regular basis is welcome to approach the group organizers and indicate their interest.</p> <p>Come out and meet with like-minded people where you can be yourself without fear of judgement. We have all been through the same stuff! Discussion is encouraged with the aim of helping each other work through what's troubling you.</p> ", 'waitlist_count': 0, 'id': 'xfmvtpyxnbxb', 'time': 1539820800000, 'created': 1528439251000, 'updated': 1528439251000, 'how_to_find_us': 'Look for other group members whose photos appear on on this meetup page.'}
{'name': 'UPPERROOM RESTCAFE', 'repinned': False, 'city': 'Sydney', 'address_1': '220 Pitt Street', 'lon': 151.2082977294922, 'id': 24154967, 'localized_country_name': 'Australia', 'lat': -33.87184143066406, 'country': 'au'}
{'duration': 7200000, 'name': 'October Serverless Meetup, Sydney', 'status': 'upcoming', 'venue': {'name': 'Versent', 'repinned': True, 'city': 'Sydney', 'address_1': "Level 6, 6-10 O'Connell Street", 'lon': 151.210205078125, 'id': 25861753, 'localized_country_name': 'Australia', 'lat': -33.8651123046875, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-18', 'group': {'who': 'Members', 'name': 'Sydney Serverless Meetup Group', 'urlname': 'Sydney-Serverless-Meetup-Group', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 19672958, 'created': 1457322583000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539846000000, 'link': 'https://www.meetup.com/Sydney-Serverless-Meetup-Group/events/251003001/', 'local_time': '18:00', 'yes_rsvp_count': 67, 'utc_offset': 39600000, 'description': '<p>Speak details are in:<br/>Rowan Udell will be doing the first talk - details to follow shortly.</p> <p>Simon Waight is the 2nd speaker for the night:<br/>Azure Serverless for Java Developers<br/>Come along and learn how to write, deploy and debug Java-based Azure Functions with the new v2 Azure Functions runtime. Learn about how you can build your own custom bindings for Functions to increase their utility in your environment.<br/>You can check out Simon\'s bio here: <a href="https://blog.siliconvalve.com/speaker-bio/" class="linkified">https://blog.siliconvalve.com/speaker-bio/</a></p> <p>This months Serverless meetup will be hosted at Versent.</p> ', 'waitlist_count': 0, 'rsvp_limit': 95, 'id': '251003001', 'created': 1526949412000, 'updated': 1536914000000, 'how_to_find_us': 'Take the elevator up to level 6.'}
{'name': 'Versent', 'repinned': True, 'city': 'Sydney', 'address_1': "Level 6, 6-10 O'Connell Street", 'lon': 151.210205078125, 'id': 25861753, 'localized_country_name': 'Australia', 'lat': -33.8651123046875, 'country': 'au'}
{'duration': 9000000, 'name': 'Ruby on Rails Development Hub', 'status': 'upcoming', 'venue': {'name': 'Airtasker', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 3, 71 York St', 'lon': 151.2057342529297, 'id': 25791855, 'localized_country_name': 'Australia', 'lat': -33.86824417114258, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-18', 'group': {'who': 'Rubyists', 'name': 'Ruby on Rails Oceania Sydney', 'urlname': 'Ruby-On-Rails-Oceania-Sydney', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 7610932, 'created': 1363232178000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'time': 1539846000000, 'link': 'https://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/events/wpttwpyxnbxb/', 'local_time': '18:00', 'yes_rsvp_count': 32, 'utc_offset': 39600000, 'description': '<p>What is it?</p> <p>This is a monthly Meetup event sponsored by reinteractive (<a href="http://www.reinteractive.net/" class="linkified">http://www.reinteractive.net/</a>) and Airtasker (<a href="https://www.airtasker.com" class="linkified">https://www.airtasker.com</a>), where you can bring along your laptop and get coding!</p> <p>No matter what your experience, from beginner to expert, professional Developers from the Community will be there to help you with whatever difficulty you are running into. Whether it be working through a step by step Rails tutorials, improving your app, or code writing tips, we\'ll be there to help you take your skills to the next level.</p> <p>Who can Attend?</p> <p>Anyone can attend the DevelopmentHub. From complete beginners to experienced users of Ruby on Rails, we welcome anyone who wants to come along to improve your Ruby on Rails skills or get advice on the Rails application of your dreams.</p> <p>For those of you who are fresh graduates of the InstallFest Meetup or have completed the first (<a href="http://railsinstallfest.org/guides/installfest/getting_started/" class="linkified">http://railsinstallfest.org/guides/installfest/getting_started/</a>) and second (<a href="http://railsinstallfest.org/guides/installfest/testing_the_blog/" class="linkified">http://railsinstallfest.org/guides/installfest/testing_the_blog/</a>) InstallFest blog posts in your own time, Community mentors will help walk you through the next series of articles to continue to develop your Ruby on Rails learning experience step by step.</p> <p>If you do not yet have a Rails development environment set up on your laptop, you might prefer to attend an InstallFest first. You can register for the next event here (<a href="https://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/events/244271091/" class="linkified">https://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/events/244271091/</a>).</p> <p>What does it cost?</p> <p>Nothing :) It\'s free!</p> <p>Will there be food?</p> <p>Yes, we are organising pizza and drinks for you to enjoy while you are working on tough Ruby on Rails problems!</p> <p>I\'m a professional Rails Developer, can I help mentor?</p> <p>Yes, we are always looking for experts to come along and help. Just send us an email (training@reinteractive.net) or RSVP to let us know you\'d like to come along.</p> <p>What have previous attendees said about Development Hub?</p> <p>“Very impressed with the way it was run, gained invaluable experience.” - Jurgens Smit (<a href="http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/91585442/" class="linkified">http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/91585442/</a>)</p> <p>“It’s really good and quite surprising that they do it for free” - Rudy Lee (<a href="http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/29671012/" class="linkified">http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/29671012/</a>)</p> <p>“Enjoyed the opportunity to build on the intro to Ruby on Rails provided at Installfest. It was great to get expert help regarding issues I had working through the subsequent reInteractive blogs. Solving the issues would have been much harder without the assistance of Mikel and his team.” - Eddie Gock (<a href="http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/88708892/" class="linkified">http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/88708892/</a>)</p> <p>"That was great. I did some more coding afterwards and just the couple of questions I had answered made me more confident of my coding and happier it was going in the right direction. Very helpful. Thanks everyone." - Glenn Morrow (<a href="http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/92710342/" class="linkified">http://www.meetup.com/Ruby-On-Rails-Oceania-Sydney/members/92710342/</a>)</p> ', 'waitlist_count': 0, 'rsvp_limit': 80, 'id': 'wpttwpyxnbxb', 'created': 1530166235000, 'updated': 1538024492000}
{'name': 'Airtasker', 'repinned': False, 'city': 'Sydney', 'address_1': 'Level 3, 71 York St', 'lon': 151.2057342529297, 'id': 25791855, 'localized_country_name': 'Australia', 'lat': -33.86824417114258, 'country': 'au'}
{'duration': 9900000, 'name': 'eCommerce Friday Night Pitches', 'status': 'upcoming', 'venue': {'name': 'Fishburners', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 2/3 11-31 York St,', 'lon': 151.20526123046875, 'id': 25668721, 'localized_country_name': 'Australia', 'lat': -33.86531066894531, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-19', 'group': {'who': 'Entrepreneurs', 'name': 'Fishburners Meetup', 'urlname': 'Fishburners-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 17306242, 'created': 1412058644000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Fishburners-Meetup/events/cgglfqyxnbzb/', 'local_time': '17:15', 'yes_rsvp_count': 47, 'utc_offset': 39600000, 'description': "<p>Every Friday night from 5:15pm, Fishburners opens its doors to host startup community pitches and networking!</p> <p>If you're looking for inspiration to start a business, learn some pitch tips or just find out what new tech startups are happening in Sydney, this event is for you.</p> <p>Please note that Fishburners values inclusive communities and all events hosted here are governed by our community code of conduct. This stems from our desire to run a productive and valuable night for our founders and all attendees where everyone feels safe and welcome to attend, bring friends, meet new people, and enjoy the start up landscape.</p> <p>To make sure we have the best environment in support of this, in cases where someone is detracting from this goal they may be asked to leave for excessive drinking, antisocial behaviour, speaking during the pitches or disengagement with the purpose of the night. If any issues arise that make you feel uncomfortable please don’t hesitate to come and speak to one of the team.</p> <p>We hope that by this everyone will enjoy an even more energetic and exciting event, continuing to grow in numbers as we focus all our considerable resources on growing and supporting the skills and passions of us all in the start up industry.</p> <p>The event schedule for Friday Night Pitches is as follows:</p> <p>• 5:15PM: Networking &amp; drinks</p> <p>• 5:30PM: Pitches begin. Grab a seat! No talking during this time :)</p> <p>• 6:30PM-8PM (approx.): Networking</p> <p>• 8PM: Event concludes</p> <p>See you soon!</p> ", 'waitlist_count': 0, 'id': 'cgglfqyxnbzb', 'time': 1539929700000, 'created': 1518060109000, 'updated': 1538539543000, 'how_to_find_us': 'Come up to Level 3 in the lifts'}
{'name': 'Fishburners', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 2/3 11-31 York St,', 'lon': 151.20526123046875, 'id': 25668721, 'localized_country_name': 'Australia', 'lat': -33.86531066894531, 'country': 'au'}
{'duration': 16200000, 'name': 'HIDDEN BEHIND CITY OFFICES', 'status': 'upcoming', 'visibility': 'public', 'local_date': '2018-10-20', 'group': {'who': 'Art Enthusiasts', 'name': 'Sydney Sketch Club', 'urlname': 'art-494', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 1248488, 'created': 1218081176000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/art-494/events/254476603/', 'local_time': '09:30', 'yes_rsvp_count': 31, 'utc_offset': 39600000, 'description': '<p>Note the earlier start time</p> <p>WHERE TO FIND US</p> <p>Meet at the fabulous mural beside “Blaq Piq” Café, 11 Alberta Street (corner of Clarke Street), Sydney CBD – see photos</p> <p>WHAT IS THERE TO DRAW</p> <p>Grungy alleyways, perspective of tall buildings, and a breath of fresh air via the colourful courtyard mural of an open-air café.<br/>TRANSPORT</p> <p>Closest train station is Museum, or if travelling by bus get out the stop closest to the Downing Centre, corner of Elizabeth and Liverpool Streets and go down a small street beside “The Canopy” called Nithsdale Street and then into Clarke Street (refer to your google map</p> <p>LUNCH</p> <p>… will be at the Blaq Piq - I highly recommend the pandan pancakes !! Menu: <a href="https://www.zomato.com/sydney/blaq-piq-cbd/menu" class="linkified">https://www.zomato.com/sydney/blaq-piq-cbd/menu</a></p> <p>Jennifer’s contact if required on the day<br/>0413 45 25 15<br/>............................<br/>There will be NO dollar charge this weekend !</p> <p>DISCLAIMER. Please note that Jennifer as volunteer organiser of activities for Sydney Sketch Club is not responsible for the health or safety of members [and their guests] and therefore will not accept any liability for accidents or injuries that may occur during Meetup events at any location, whether outside in public areas or inside commercial venues. By attending any Sydney Sketch Club event you acknowledge that you accept all of the above.</p> ', 'waitlist_count': 0, 'id': '254476603', 'time': 1539988200000, 'created': 1536328037000, 'updated': 1536328037000}
error key not found
{'duration': 14400000, 'name': 'Competitive Programming Fortnightly Meetup', 'status': 'upcoming', 'venue': {'name': 'General Assembly Sydney', 'repinned': True, 'city': 'Sydney', 'address_1': '1 Market Street (Entrance off Kent Street)', 'lon': 151.20457458496094, 'id': 25573905, 'localized_country_name': 'Australia', 'lat': -33.87124252319336, 'country': 'au'}, 'visibility': 'public', 'local_date': '2018-10-20', 'group': {'who': 'Programmers', 'name': 'Sydney Competitive Programming Meetup', 'urlname': 'Sydney-Competitive-Programming-Meetup', 'region': 'en_US', 'lon': 151.2100067138672, 'id': 29227222, 'created': 1531450698000, 'localized_location': 'Sydney, Australia', 'lat': -33.869998931884766, 'join_mode': 'open', 'timezone': 'Australia/Sydney'}, 'link': 'https://www.meetup.com/Sydney-Competitive-Programming-Meetup/events/qrcjgqyxnbbc/', 'local_time': '12:00', 'yes_rsvp_count': 7, 'utc_offset': 39600000, 'description': "<p>Welcome to our meetup!</p> <p>Special thanks to General Assembly, who've offered up some space at their campus to run our meetup, as we now have a home to host our events.</p> <p>As this is a new group, the meetup structure will be pretty dynamic to start with. This is a chance for you to help shape what you want from this meetup and how you think it is best run! So please give us your feedback and we develop a structure that's best for us all.</p> <p>What you can expect from the event is:</p> <p>1. MICRO-TALK &amp; INTRODUCTION: We will start the event with an introduction to the coordinators, discuss the rules and introduce the challenges for the day. In the future we will be getting the community to present on things such as their favourite packages, design practices and quirks of certain languages.</p> <p>2. MICRO-HACKATHON: Next we will split off and begin to develop our solutions. You can work on your own, or band together into teams to solve problems together.</p> <p>3. PRESENTATIONS &amp; NETWORKING: Finally we will present our solutions to one another, crown a winner and come together to meet and chat. This is a social meetup remember!</p> <p>You will need to bring with you your own laptop, but we are happy to help you set up your development environment for newbies. This event is for all skill levels. Don't feel intimidated if you are new to coding!</p> <p>ABOUT OUR PARTNER<br/>========================</p> <p>General Assembly is a pioneer in education and career transformation, specializing in today’s most in-demand skills. The leading source for training, staffing, and career transitions, we foster a flourishing community of professionals pursuing careers they love.</p> ", 'waitlist_count': 0, 'id': 'qrcjgqyxnbbc', 'time': 1539997200000, 'created': 1536283389000, 'updated': 1537435851000}
{'name': 'General Assembly Sydney', 'repinned': True, 'city': 'Sydney', 'address_1': '1 Market Street (Entrance off Kent Street)', 'lon': 151.20457458496094, 'id': 25573905, 'localized_country_name': 'Australia', 'lat': -33.87124252319336, 'country': 'au'}
In [78]:
print(reqjs[0]['venue'])
{'name': 'Airtasker', 'repinned': True, 'city': 'Sydney', 'address_1': 'Level 3, 71 York St', 'lon': 151.2057342529297, 'id': 25791855, 'localized_country_name': 'Australia', 'lat': -33.86824417114258, 'country': 'au'}
In [83]:
import arrow
In [84]:
timnow = arrow.now()
In [88]:
print(timnow.datetime)
2018-10-10 05:50:24.110369+00:00
In [98]:
def createfutloc(customer_id):
    return(dict({'customer_id' : customer_id, 'name' : reqjs[0]['venue']['name'], 'address' : reqjs[0]['venue']['address_1'] +  ' ' + reqjs[0]['venue']['city'] + ' ' + reqjs[0]['venue']['localized_country_name'], 'seen_at' : str(timnow.datetime)}))
In [99]:
createfutloc('hammers@gmail.com')
Out[99]:
{'address': 'Level 3, 71 York St Sydney Australia',
 'customer_id': 'hammers@gmail.com',
 'name': 'Airtasker',
 'seen_at': '2018-10-10 05:50:24.110369+00:00'}
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [ ]:
  {
    "customer_id": "00:14:22:01:23:45",
    "venue_id": "FJHKL334",
    "name": "Level 1",
    "address": "3 Drewberry Lane",
    "seen_at": "2017-11-29T08:09:57Z"
  }
In [79]:
reqjs[0]['venue']['']
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-79-c21a863fd78f> in <module>()
----> 1 reqjs[0]['venue']['']

KeyError: ''
In [108]:
import getmac
In [121]:
import sqlite3
In [122]:
connid = sqlite3.connect('identity.db')
In [123]:
c = connid.cursor()

# Create table
c.execute('''CREATE TABLE identify
             (first_name, last_name, email, birthdate, gender, marketing_consent)''')

# Insert a row of data
c.execute("INSERT INTO identify VALUES ('{}','{}','{}', '{}', '{}', '{}')".format(first_name, last_name, email, birthdate, gender, marketing_consent))

# Save (commit) the changes
connid.commit()

# We can also close the connection if we are done with it.
# Just be sure any changes have been committed or they will be lost.
connid.close()
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-123-0777da33ac1d> in <module>()
      6 
      7 # Insert a row of data
----> 8 c.execute("INSERT INTO identify VALUES ('{}','{}','{}', '{}', '{}', '{}')".format(first_name, last_name, email, birthdate, gender, marketing_consent))
      9 
     10 # Save (commit) the changes

NameError: name 'first_name' is not defined
In [124]:
def createdb(namedb):
    connid = sqlite3.connect('{}.db'.format(namedb))
    c.execute('''CREATE TABLE identify
             (first_name, last_name, email, birthdate, gender, marketing_consent)''')
    connid.commit()
    connid.close()
    
In [126]:
createdb('heo')
ERROR:root:An unexpected error occurred while tokenizing input
The following traceback may be corrupted or invalid
The error message is: ('EOF in multi-line string', (1, 81))

---------------------------------------------------------------------------
OperationalError                          Traceback (most recent call last)
<ipython-input-126-ed50444a16cb> in <module>()
----> 1 createdb('heo')

<ipython-input-124-b4c1457897fa> in createdb(namedb)
      2     connid = sqlite3.connect('{}.db'.format(namedb))
      3     c.execute('''CREATE TABLE identify
----> 4              (first_name, last_name, email, birthdate, gender, marketing_consent)''')
      5     connid.commit()
      6     connid.close()

OperationalError: table identify already exists
In [ ]:
 
In [127]:
def createsqprofile(first_name, last_name, email, marketing_consent, birthdate, gender):
    connid = sqlite3.connect('identity.db')
    c = connid.cursor()
    c.execute("INSERT INTO identify VALUES ('{}','{}','{}', '{}', '{}', '{}')".format(first_name, last_name, email, birthdate, gender, marketing_consent))
    connid.commit()
    connid.close()

    #db.insert({'first_name' : first_name, 'last_name' : last_name, 'email' : email, 'marketing_consent' : marketing_consent, 'birthdate' : birthdate, 'gender' : gender})
    #return('Hello {} {}. Your email is {}. Marketing opt is {}. Your birthdate is {} and you are a {}'.format(first_name, last_name, email, marketing_consent, birthdate, gender))
    return({'first_name' : first_name, 'last_name' : last_name, 'email' : email, 'marketing_consent' : marketing_consent, 'birthdate' : birthdate, 'gender' : gender})
In [128]:
createsqprofile('william', 'mckee', 'hammersmake@gmail.com', 'True', '1974-10-10', 'male')
Out[128]:
{'birthdate': '1974-10-10',
 'email': 'hammersmake@gmail.com',
 'first_name': 'william',
 'gender': 'male',
 'last_name': 'mckee',
 'marketing_consent': 'True'}
In [ ]:
 
In [ ]:
def mkdatabspro((first_name, last_name, email, marketing_consent, birthdate, gender):
In [ ]:
conn = sqlite3.connect('example.db')
c = conn.cursor()
c.execute("INSERT INTO identify VALUES ('{}','William','Mckee','hammer@gmail.com', '1974-08-01', 'male', 'True')".format(getmac.get_mac_address()))
conn.commit()
conn.close()

blogpost

In [141]:
import requests
import shutil
import os
import getpass
from urllib.parse import urlparse
import json
import arrow
import glob
import os
import arrow
import time
import subprocess
In [115]:
def mkyr(blogpath):
    raw = arrow.now()
    if raw.strftime("%Y") not in os.listdir(blogpath + '/galleries'):
        os.mkdir(blogpath + '/galleries/' + raw.strftime("%Y"))
        #return(raw.strftime("%Y"))
    else:
        return('ERROR: Year already exists')
In [116]:
def mkmth(blogpath):
    raw = arrow.now()
    if raw.strftime("%m") not in os.listdir(blogpath + '/galleries/' + raw.strftime("%Y")):
        os.mkdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m"))
        return(raw.strftime("%m"))
    else:
        return('ERROR: Month already exists')
In [117]:
def mkday(blogpath):
    raw = arrow.now()
    if raw.strftime("%d") not in os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m")):
        os.mkdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d'))
        return(raw.strftime('%d'))
    else:
        return('ERROR: Day already exists')
In [139]:
def cpdayimg(orginpath, blogpath):
    #copies images from origin folder to blog folder
    raw = arrow.now()
    files = os.listdir(orginpath)
    for f in files:
        shutil.copy(orginpath + '/' + f, blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d'))
    return(os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d')))    
In [140]:
cpdayimg('/home/wcmckee/imgtest', '/home/wcmckee/artctrl-test/')
Out[140]:
['1017-05_BTS_WEB_HOME_wk14_2.jpg',
 '1017-05_BTS_WEB_HOME_wk14_28.jpg',
 '1017-05_BTS_WEB_HOME_wk14_32.jpg',
 '1017-05_BTS_WEB_HOME_wk14_30.jpg',
 '1017-05_BTS_WEB_HOME_wk14_26.jpg',
 'test.jpg',
 '1017-05_BTS_WEB_HOME_wk14_24.jpg',
 'david.jpg',
 '1017-05_BTS_WEB_HOME_wk14_16.jpg',
 '1017-05_BTS_WEB_HOME_wk14_10.jpg',
 '1017-05_BTS_WEB_HOME_wk14_4.jpg']
In [129]:
def mkblogpost(blogpath, postname, tagblog):
    raw = arrow.now()
    fultim = raw.datetime
    
    if postname + '.md' not in os.listdir(blogpath + '/posts'):
        with open(blogpath + '/posts/' + postname + '.meta', 'w') as daympo:
            daympo.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: {}\n.. link:\n.. description:\n.. type: text'.format(postname, postname, fultim, tagblog))
            
        with open(blogpath + '/posts/' + postname + '.md', 'w') as daymark:
            for toar in os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d')):

                daymark.write('![{}]({}{})\n\n'.format(toar.replace('.png', ''), '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d') + '/', toar))

    else:
        return('ERROR: post already exists')
In [178]:
import awscli
In [179]:
awscli.en
Out[179]:
<module 'awscli' from '/usr/lib/python3/dist-packages/awscli/__init__.py'>
In [181]:
awscli.EnvironmentVariables
Out[181]:
{'ca_bundle': ('ca_bundle', 'AWS_CA_BUNDLE', None, None),
 'output': ('output', 'AWS_DEFAULT_OUTPUT', 'json', None)}
In [ ]:
 
In [136]:
def syncblogpost():
    #rsync the galleries and post files to various services - aws and digitalocean bucket. 
  File "<ipython-input-136-95d777530b3b>", line 2
    #rsync the galleries and post files to various services - aws and digitalocean bucket.
                                                                                           ^
SyntaxError: unexpected EOF while parsing

folder of images. create request that creates json of image name, id, and text descrion. this is the text that goes after the image.

In [163]:
def imgjstru(blogpath, postname):
    sampdict = dict()
    raw = arrow.now()
    for osli in os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d')):
        #print(osli)
        sampdict.update({osli : dict({'id' : 'one', 'text' : 'sampletext'})})
    #os.listdir(blogpath + '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d'))
    return(sampdict)
In [173]:
def singimg(blogpath, postname, imgname, imgtxt):
    sampdict = dict()
    raw = arrow.now()
  
    with open(blogpath + '/posts/' + postname + '.md', 'a') as daymark:
        daymark.write('![{}]({}{})\n\n{}\n\n'.format(imgname.replace('.png', ''), '/galleries/' + raw.strftime("%Y") + '/' + raw.strftime("%m") + '/' + raw.strftime('%d') + '/', imgname, imgtxt))

    
In [174]:
singimg('/home/wcmckee/artctrl-test', 'omgz', 'hello.png', 'this is lame')
In [175]:
singimg('/home/wcmckee/artctrl-test', 'omgz', 'never.png', 'well this didnt look good')
In [ ]:
 
In [ ]:
 
In [164]:
sam
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-164-3ef32f8c6790> in <module>()
----> 1 sam

NameError: name 'sam' is not defined
In [165]:
imgjstru('/home/wcmckee/artctrl-test', 'testing')
Out[165]:
{'1017-05_BTS_WEB_HOME_wk14_10.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_16.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_2.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_24.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_26.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_28.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_30.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_32.jpg': {'id': 'one', 'text': 'sampletext'},
 '1017-05_BTS_WEB_HOME_wk14_4.jpg': {'id': 'one', 'text': 'sampletext'},
 'david.jpg': {'id': 'one', 'text': 'sampletext'},
 'test.jpg': {'id': 'one', 'text': 'sampletext'}}
In [ ]:
 
In [ ]:
 
In [ ]:
 

bestandless

best&less account stuff

Python command line script to register, lookup, edit, delete accounts. The accounts are sorted by email address.

Changes from current:

ask twice for email and password

force captive

For every account:

Creates a config file. Creates a json object. Creates a markdown blog post with python-nikola. Creates socially awkward seal GOODBYE FIRSTNAME, HELLO FIRSTNAME meme.

3 database record of previous entries.

gender - display male.jpg for male and female.jpg for female

Gets art quote of the day and replaces art with fashion. add fashion to middle of authors name. add quote to top of blog post. TODO: display quote as image

random choice of clothes. Pregenerated random, recommended. currently random choice of -

In [ ]:
 
In [1]:
import requests
import bs4
import getpass
import json
import shutil
import PIL
import json
from PIL import ImageDraw, ImageFont
import os
#import arrow
import configparser
In [2]:
myusr = getpass.getuser()
In [3]:
import arrow
In [4]:
loginda = input('Enter first name: ')
loglast = input('Enter last name: ')
print('Suggestions {}{}'.format(loginda, loglast))
userset = input('Enter username: ')
logemail = input('Enter email: ')
emailtwic = input('Enter email again: ')
if logemail == emailtwic:
    print('Email are the same')
else:
    print('Email is not the same')
        
bodenter = input('date of birth: YEAR/MNTH/DAY: ')
datebith = arrow.get(bodenter)
datebithz = datebith.strftime('%Y')
    
passent = getpass.getpass('Enter password: ')
passagain = getpass.getpass('Enter password again: ')
if passent == passagain:
    print('They are correct')
    
else:
    print('They are not correct')
        
imgend = input('gender: ')
    
    #import ConfigParser

config = configparser.RawConfigParser()

    # When adding sections or items, add them in the reverse order of
    # how you want them to be displayed in the actual file.
    # In addition, please note that using RawConfigParser's and the raw
    # mode of ConfigParser's respective set functions, you can assign
    # non-string values to keys internally, but will receive an error
    # when attempting to write to a file or when you get it in non-raw
    # mode. SafeConfigParser does not allow such assignments to take place.
config.add_section(logemail)
config.set(logemail, 'firstname', loginda)
config.set(logemail, 'lastname', loglast)
config.set(logemail, 'email', logemail)
    # Writing our configuration file to 'example.cfg'
with open('/home/pi/.config/bestless.ini', 'w') as configfile:
    config.write(configfile)
Enter first name: will
Enter last name: mckee
Suggestions willmckee
Enter username: willmckee
Enter email: will@artcontrol.me
Enter email again: will@artcontrol.me
Email are the same
date of birth: YEAR/MNTH/DAY: 2001/12/12
Enter password: ········
Enter password again: ········
They are correct
gender: male
In [5]:
import os
import random
In [92]:
if imgend == 'male':
    genpat = ('/galleries/male.jpg')
    rancho = random.choice(os.listdir('/home/pi/memetest/galleries/male/'))
    genshit = ('/galleries/male/{}'.format(rancho))
    #gmal = requests.get('https://api.gilt.com/v1/sales/men/upcoming.json?apikey=bb7cf716ec52e7a7737705f0129ed4282a35239a0a6b8a821e68f30a00ecc1a7')

elif imgend == 'female':
    genpat = ('/galleries/female.jpg')
    rancho = random.choice(os.listdir('/home/pi/memetest/galleries/female/'))
    #gmal = requests.get('https://api.gilt.com/v1/sales/women/upcoming.json?apikey=bb7cf716ec52e7a7737705f0129ed4282a35239a0a6b8a821e68f30a00ecc1a7')

    genshit = ('/galleries/female/{}'.format(rancho))
In [71]:
#salejs = gmal.json()
In [78]:
#salelen = len(salejs['sales'])
In [87]:
#for salele in range(salelen):
#    print(salejs['sales'][salele]['description'])
#    print(salejs['sales'][salele]['image_urls']['686x374'][0]['url'])
Suede boots, leather sneakers, and more for wherever the day (or night) may take you
https://cdn1.gilt.com/images/share/uploads/0000/0005/1726/517269805/orig.jpg
Contemporary picks for partying or parlaying
https://cdn1.gilt.com/images/share/uploads/0000/0005/1752/517526486/orig.jpg
Stay stylishly warm with gloves, scarves, and more
https://cdn1.gilt.com/images/share/uploads/0000/0005/1756/517563618/orig.jpg
Get your wardrobe together at prices you can’t miss
https://cdn1.gilt.com/images/share/uploads/0000/0005/1702/517028519/orig.jpg
The essentials you need to upgrade your look at can’t-beat prices
https://cdn1.gilt.com/images/share/uploads/0000/0005/1756/517565775/orig.jpg
The perfect layering piece to keep you warm in style
https://cdn1.gilt.com/images/share/uploads/0000/0005/1702/517027992/orig.jpg
Minimalist styles for the modern man
https://cdn1.gilt.com/images/share/uploads/0000/0005/1731/517318169/orig.jpg
Sophisticated pieces for an instantly pulled-together look
https://cdn1.gilt.com/images/share/uploads/0000/0005/1726/517269652/orig.jpg
Get your dapper look together—before the ball drops
https://cdn1.gilt.com/images/share/uploads/0000/0005/1656/516565162/orig.jpg
Handsome selects that complete your look
https://cdn1.gilt.com/images/share/uploads/0000/0005/1727/517270236/orig.jpg
From colorful to classic, find the shape that suits you best
https://cdn1.gilt.com/images/share/uploads/0000/0005/1726/517269900/orig.jpg
Elevated basics, from undershirts to briefs
https://cdn1.gilt.com/images/share/uploads/0000/0005/1727/517276348/orig.jpg
In [65]:
#requests.get('https://api.gilt.com/v1/sales/men/upcoming.json?apikey=bb7cf716ec52e7a7737705f0129ed4282a35239a0a6b8a821e68f30a00ecc1a7')
Out[65]:
<Response [200]>
In [ ]:
#https://api.gilt.com/v1/sales/women/active.json?apikey=bb7cf716ec52e7a7737705f0129ed4282a35239a0a6b8a821e68f30a00ecc1a7
In [7]:
genshit
Out[7]:
'/galleries/male/male-shorts.jpeg'
In [8]:
#import random
In [9]:
#ranum = random.randint(100,1000)
In [10]:
#for it in range(100,1000):
#    print('suggestion: {}{}{}'.format(loginda, loglast, it))
In [11]:
#ranum
In [12]:
#reqgif = requests.get('http://api.giphy.com/v1/gifs/search?q={}+fashion&api_key=ee58ff1d10c54fd29ddb0388126c2bcd'.format(datebith))
In [13]:
#gifjs = reqgif.json()
In [14]:
#for himg in range(25):
    
#    img_data = requests.get(gifjs['data'][himg]['images']['fixed_height']['url']).content
#    with open('{}.gif'.format(str(himg)), 'wb') as handler:
#        handler.write(img_data)
#    print(gifjs['data'][himg]['images']['fixed_height']['url'])
In [15]:
with open('/home/{}/account.json'.format(myusr), 'r') as accdict:
    readd = accdict.read()
    readdict = json.loads(readd)
In [16]:
#readdict = json.loads(readd)
In [17]:
#print(readdict)
In [18]:
emailup = input('Email to lookup: ')
Email to lookup: hammersmake@gmail.com
In [19]:
print(readdict[emailup])
{'lastname': 'Bill', 'email': 'hammersmake@gmail.com', 'password': 'hashthis', 'agehuman': '15 years ago', 'firstname': 'Will', 'gender': 'male', 'id': 7, 'dob': '2001/12/25'}
In [20]:
emailcont = ('Hello {},\n\nToday we have sale on {}. It is suitable for someone born {}.\n\nHave a great day,\n\nbest&less.'.format(loginda, imgend, datebith.humanize(), str(datebithz)))
In [21]:
emailcont
Out[21]:
'Hello will,\n\nToday we have sale on male. It is suitable for someone born 16 years ago.\n\nHave a great day,\n\nbest&less.'
In [ ]:
 
In [22]:
lenid = len(readdict)
In [23]:
nexid = lenid + 1
In [24]:
textzero = 'BYE ' + loginda 
textone = 'HELLO ' + loginda
upzero = textzero.upper()

botzero = textone.upper()


# In[ ]:
In [25]:
botzero
Out[25]:
'HELLO WILL'
In [26]:
#gheigh = (gtm['height'])
#gwth = (gtm['width'])
        #response = requests.get(gtm['url'], stream=True)
        #with open('{}{}-reference.jpg'.format(repathz, str(rdz.author)), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
        
        #with open('/home/{}/memetest/galleries/{}.png'.format(myusr, gtm['id']), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
            
img = PIL.Image.open('/home/{}/Downloads/seal.jpg'.format(myusr))

imageSize = img.size
print(imageSize)
        # find biggest font size that works
fontSize = int(imageSize[1]/5)
print(fontSize)
font = ImageFont.truetype("/home/{}/impact.ttf".format(myusr), fontSize)
topTextSize = font.getsize(upzero)
bottomTextSize = font.getsize(botzero)
print(topTextSize)
while topTextSize[0] > imageSize[0]-20 or bottomTextSize[0] > imageSize[0]-20:
    fontSize = fontSize - 1
    font = ImageFont.truetype("/home/{}/impact.ttf".format(myusr), fontSize)
    topTextSize = font.getsize(upzero)
    bottomTextSize = font.getsize(botzero)
    print(bottomTextSize)

        # find top centered position for top text
topTextPositionX = (imageSize[0]/2) - (topTextSize[0]/2)
topTextPositionY = 0
topTextPosition = (topTextPositionX, topTextPositionY)

        # find bottom centered position for bottom text
bottomTextPositionX = (imageSize[0]/2) - (bottomTextSize[0]/2)
bottomTextPositionY = imageSize[1] - bottomTextSize[1] -10
bottomTextPosition = (bottomTextPositionX, bottomTextPositionY)

draw = ImageDraw.Draw(img)

outlineRange = int(fontSize/15)
for x in range(-outlineRange, outlineRange+1):
    for y in range(-outlineRange, outlineRange+1):
            draw.text((topTextPosition[0]+x, topTextPosition[1]+y), upzero, (0,0,0), font=font)
            draw.text((bottomTextPosition[0]+x, bottomTextPosition[1]+y), botzero, (0,0,0), font=font)

    draw.text(topTextPosition, upzero, (255,255,255), font=font)
    draw.text(bottomTextPosition, botzero, (255,255,255), font=font)

    img.save("/home/{}/memetest/galleries/{}.jpg".format(myusr, str(nexid)))
            #print(gtm['id'])
            #filemh = gtm['id']
    #print('hello')
(620, 397)
79
(275, 80)
In [27]:
#reqote = requests.get('https://www.goodreads.com/quotes/tag/clothes')
In [28]:
somequote = requests.get('http://quotes.rest/qod.json?category=art')
In [29]:
quotejs = (somequote.json())
In [30]:
myqute = (quotejs['contents']['quotes'][0]['quote'])
In [31]:
lenqute = (quotejs['contents']['quotes'][0]['length'])
In [32]:
qutefas = myqute.replace('art', 'fashion')
In [33]:
qutefas
Out[33]:
'The whole culture is telling you to hurry, while the fashion tells you to take your time. Always listen to the fashion.'
In [34]:
quoteauth = quotejs['contents']['quotes'][0]['author']
In [35]:
auspit = quoteauth.split(' ')
In [36]:
fashauthz = (auspit[0] + " 'fashion' " + auspit[1])
In [37]:
bothquote = '"' + qutefas + '" - ' + fashauthz
In [38]:
print(bothquote)
"The whole culture is telling you to hurry, while the fashion tells you to take your time. Always listen to the fashion." - Junot 'fashion' Diaz
In [ ]:
 
In [39]:
print(lenqute)
111
In [40]:
lenqute
Out[40]:
'111'
In [41]:
#if lenqute < 20:
    
In [42]:
#print(quotejs['contents']['quotes'][0]['id'])
In [43]:
#float(lenqute) /2
In [44]:
#gheigh = (gtm['height'])
#gwth = (gtm['width'])
        #response = requests.get(gtm['url'], stream=True)
        #with open('{}{}-reference.jpg'.format(repathz, str(rdz.author)), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
        
        #with open('/home/{}/memetest/galleries/{}.png'.format(myusr, gtm['id']), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
            
img = PIL.Image.open('/home/pi/Downloads/seal.jpg')

imageSize = img.size
print(imageSize)
        # find biggest font size that works
fontSize = int(imageSize[1]/5)
print(fontSize)
font = ImageFont.truetype("/home/{}/impact.ttf".format(myusr), fontSize)
topTextSize = font.getsize(quotejs['contents']['quotes'][0]['quote'])
#bottomTextSize = font.getsize(quotejs['contents']['quotes'][0]['quote'])
print(topTextSize)
while topTextSize[0] > imageSize[0]-20:
    fontSize = fontSize - 1
    font = ImageFont.truetype("/home/{}/impact.ttf".format(myusr), fontSize)
    topTextSize = font.getsize(quotejs['contents']['quotes'][0]['quote'])
    #bottomTextSize = font.getsize(quotejs['contents']['quotes'][0]['quote'])
    #print(bottomTextSize)

        # find top centered position for top text
topTextPositionX = (imageSize[0]/2) - (topTextSize[0]/2)
topTextPositionY = 100
topTextPosition = (topTextPositionX, topTextPositionY)

        # find bottom centered position for bottom text
#bottomTextPositionX = (imageSize[0]/2) - (bottomTextSize[0]/2)
#bottomTextPositionY = imageSize[1] - bottomTextSize[1] -10
#bottomTextPosition = (bottomTextPositionX, bottomTextPositionY)

draw = ImageDraw.Draw(img)

outlineRange = int(fontSize/15)
for x in range(-outlineRange, outlineRange+1):
    for y in range(-outlineRange, outlineRange+1):
            draw.text((topTextPosition[0]+x, topTextPosition[1]+y), quotejs['contents']['quotes'][0]['quote'], (0,0,0), font=font)
            #draw.text((bottomTextPosition[0]+x, bottomTextPosition[1]+y), quotejs['contents']['quotes'][0]['quote'], (0,0,0), font=font)

    draw.text(topTextPosition, quotejs['contents']['quotes'][0]['quote'], (255,255,255), font=font)
    #draw.text(bottomTextPosition, quotejs['contents']['quotes'][0]['quote'], (255,255,255), font=font)
    img2 = img.crop((0, 80, 610,200))
    #img2.save("img2.jpg")

    img2.save("/home/{}/memetest/galleries/{}.jpg".format(myusr, str(quotejs['contents']['quotes'][0]['id'])))
            #print(gtm['id'])
            #filemh = gtm['id']
    #print('hello')
(620, 397)
79
(3317, 88)
In [ ]:
 
In [ ]:
 
In [45]:
#print(reqote)
In [46]:
#quotbs = bs4.BeautifulSoup(reqote.text)
In [47]:
#print(quotbs)
In [48]:
#allquote = quotbs.find('div', {'class' : "quoteText"})
In [49]:
#print(quotbs.find_all('a'))
In [50]:
#for allq in allquote:
#    print(allq)
In [ ]:
 
In [51]:
accdict = ({logemail : dict({'firstname' : loginda, 'lastname' : loglast, 'email' : logemail, 'password' : 'hashthis', 'gender': imgend, 'agehuman' : datebith.humanize(),'dob' : bodenter, 'id' : nexid, 'image' : '/galleries/{}.jpg', 'username' : userset, 'post' : '/posts/{}.md'.format(str(nexid), str(nexid))})})
In [52]:
#print(accdict)
In [53]:
import json
In [54]:
z = {**readdict, **accdict}
In [55]:
z
Out[55]:
{'beer@gmail.com': {'agehuman': '14 years ago',
  'dob': '2003/12/12',
  'email': 'beer@gmail.com',
  'firstname': 'be',
  'gender': 'male',
  'id': 17,
  'image': '/galleries/{}.jpg',
  'lastname': 'er',
  'password': 'hashthis',
  'post': '/posts/17.md',
  'username': 'beer'},
 'debian@deb.com': {'agehuman': '15 years ago',
  'dob': '2001/12/25',
  'email': 'debian@deb.com',
  'firstname': 'deb',
  'gender': 'male',
  'id': 8,
  'lastname': 'ian',
  'password': 'hashthis'},
 'dude@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/12/12',
  'email': 'dude@gmail.com',
  'firstname': 'dude',
  'gender': 'female',
  'id': 14,
  'image': '/galleries/{}.jpg',
  'lastname': 'liss',
  'password': 'hashthis',
  'post': '/posts/14.md',
  'username': 'dudeliss'},
 'dude@hayden.com': {'agehuman': '16 years ago',
  'dob': '2001/09/11',
  'email': 'dude@hayden.com',
  'firstname': 'hay',
  'gender': 'male',
  'id': 10,
  'lastname': 'den',
  'password': 'hashthis'},
 'funguy@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/09/12',
  'email': 'funguy@gmail.com',
  'firstname': 'fun',
  'gender': 'male',
  'id': 12,
  'image': '/galleries/{}.jpg',
  'lastname': 'nny',
  'password': 'hashthis',
  'post': '/posts/12.md'},
 'hammersmake@gmail.com': {'agehuman': '15 years ago',
  'dob': '2001/12/25',
  'email': 'hammersmake@gmail.com',
  'firstname': 'Will',
  'gender': 'male',
  'id': 7,
  'lastname': 'Bill',
  'password': 'hashthis'},
 'holley@mckee.com': {'agehuman': '29 years ago',
  'dob': '1988/03/09',
  'email': 'holley@mckee.com',
  'firstname': 'Holley',
  'gender': 'female',
  'lastname': 'Mckee',
  'password': 'hashthis'},
 'joe@bin.com': {'agehuman': '72 years ago',
  'dob': '1945/12/25',
  'email': 'joe@bin.com',
  'firstname': 'joe',
  'gender': 'male',
  'lastname': 'bin',
  'password': 'hashthis'},
 'lin@gmail.com': {'agehuman': '15 years ago',
  'dob': '2001/12/25',
  'email': 'lin@gmail.com',
  'firstname': 'lin',
  'gender': 'female',
  'id': 7,
  'lastname': 'dea',
  'password': 'hashthis'},
 'losers@gmail.com': {'agehuman': '29 years ago',
  'dob': '1988/04/01',
  'email': 'losers@gmail.com',
  'firstname': 'los',
  'gender': 'male',
  'id': 16,
  'image': '/galleries/{}.jpg',
  'lastname': 'ers',
  'password': 'hashthis',
  'post': '/posts/16.md',
  'username': 'losers'},
 'myself@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/02/12',
  'email': 'myself@gmail.com',
  'firstname': 'myse',
  'gender': 'male',
  'id': 18,
  'image': '/galleries/{}.jpg',
  'lastname': 'lf',
  'password': 'hashthis',
  'post': '/posts/18.md',
  'username': 'myself'},
 'p@moodley.com': {'agehuman': '41 years ago',
  'dob': '1976/12/12',
  'email': 'p@moodley.com',
  'firstname': 'Peter',
  'gender': 'female',
  'lastname': 'Moodley',
  'password': 'hashthis'},
 'qwerty@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/09/11',
  'email': 'qwerty@gmail.com',
  'firstname': 'qwe',
  'gender': 'male',
  'id': 13,
  'image': '/galleries/{}.jpg',
  'lastname': 'erty',
  'password': 'hashthis',
  'post': '/posts/13.md',
  'username': 'qwerty'},
 'random@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/09/11',
  'email': 'random@gmail.com',
  'firstname': 'ran',
  'gender': 'male',
  'id': 11,
  'lastname': 'dom',
  'password': 'hashthis'},
 'samsung@gmail.com': {'agehuman': '5 years ago',
  'dob': '2012/02/02',
  'email': 'samsung@gmail.com',
  'firstname': 'sam',
  'gender': 'male',
  'id': 21,
  'image': '/galleries/{}.jpg',
  'lastname': 'sung',
  'password': 'hashthis',
  'post': '/posts/21.md',
  'username': 'samsung'},
 'somez@gmail.com': {'agehuman': '3 years ago',
  'dob': '2013/12/25',
  'email': 'somez@gmail.com',
  'firstname': 'Someone',
  'gender': 'male',
  'lastname': 'Now',
  'password': 'hashthis'},
 'ubun@nut.com': {'agehuman': '16 years ago',
  'dob': '2001/11/11',
  'email': 'ubun@nut.com',
  'firstname': 'ubun',
  'gender': 'male',
  'id': 9,
  'lastname': 'tu',
  'password': 'hashthis'},
 'wazza@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/02/02',
  'email': 'wazza@gmail.com',
  'firstname': 'waz',
  'gender': 'female',
  'id': 20,
  'image': '/galleries/{}.jpg',
  'lastname': 'za',
  'password': 'hashthis',
  'post': '/posts/20.md',
  'username': 'wazza'},
 'will@art.com': {'agehuman': '29 years ago',
  'dob': '1988/12/12',
  'email': 'will@art.com',
  'firstname': 'seal',
  'gender': 'female',
  'id': 15,
  'image': '/galleries/{}.jpg',
  'lastname': 'aermy',
  'password': 'hashthis',
  'post': '/posts/15.md',
  'username': 'sealaermy'},
 'will@artcontrol.me': {'agehuman': '16 years ago',
  'dob': '2001/12/12',
  'email': 'will@artcontrol.me',
  'firstname': 'will',
  'gender': 'male',
  'id': 22,
  'image': '/galleries/{}.jpg',
  'lastname': 'mckee',
  'password': 'hashthis',
  'post': '/posts/22.md',
  'username': 'willmckee'},
 'willmckee@gmail.com': {'agehuman': '16 years ago',
  'dob': '2001/12/04',
  'email': 'willmckee@gmail.com',
  'firstname': 'will',
  'gender': 'male',
  'id': 19,
  'image': '/galleries/{}.jpg',
  'lastname': 'mckee',
  'password': 'hashthis',
  'post': '/posts/19.md',
  'username': 'willmckee'},
 'wow@gmail.com': {'agehuman': '32 years ago',
  'dob': '1985/12/04',
  'email': 'wow@gmail.com',
  'firstname': 'woowa',
  'gender': 'male',
  'id': 6,
  'lastname': 'never',
  'password': 'hashthis'}}
In [56]:
#json.dumps(accdict)
In [57]:
with open('/home/pi/account.json', 'w') as blacc:
    blacc.write(json.dumps(z))
          
In [58]:
#cat /home/pi/account.json
In [59]:
timnow = arrow.now()
In [60]:
timnowz = timnow.datetime
In [61]:
#print(timnowz)
In [62]:
with open('/home/{}/memetest/posts/{}.md'.format(myusr, str(nexid)), 'w') as resulmd:
    resulmd.write('{}\n\n![{}](/galleries/{}.jpg)\n\n First name: {}\n\nLast name: {}\n\nEmail: {}\n\nGender: {}\n\n{}\n\n![gender]({})\n\n![cloth]({})'.format(bothquote, str(nexid), str(nexid), loginda, loglast, logemail, imgend, emailcont, genpat, genshit))
        
with open ('/home/{}/memetest/posts/{}.meta'.format(myusr, str(nexid)), 'w') as opmetat:
    #opmetat.write("{}".format(str(curtim))
            #for arage in alltags:
            #    print(arage)
    opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(str(nexid), str(nexid), timnowz))
In [63]:
#with open('/home/pi/hugosite/content/post/{}.md'.format(str(nexid)), 'w') as hupost:
 #   hupost.write('+++\ndate = "{}"\ntitle = {}\n\n+++\n\nFirst name: {}\n\nLast name: {}\n\nUsername: {}\n\nEmail: {}\n\nGender: {}'.format(timnowz, str(nexid), loginda, loglast, logemail, imgend))
    #hupost.write('+++\ndate = "{}"\ntitle = {}\n\n+++\n\nFirst name: '.format(timnowz))
In [91]:
#for salele in range(salelen):
    

#    descript = (salejs['sales'][salele]['description'])
#    imurl = (salejs['sales'][salele]['image_urls']['686x374'][0]['url'])
#    with open('/home/{}/memetest/posts/{}.md'.format(myusr, str(nexid)), 'a') as resulmd:
#        resulmd.write('\n\n![photo]({})\n\n'.format(imurl))
In [ ]:
 
In [64]:
#cat /home/pi/hugosite/content/post/6.md
In [ ]:
 
In [ ]:
 
In [ ]:
 
In [ ]:
 

memegen

Write your post here.

In [4]:
import arrow
In [5]:
#configparser.p
In [6]:
thtim = arrow.now()
In [ ]:
timnow = arrow.now()
In [8]:
thtim.timestamp
Out[8]:
1508461744
In [2]:
#import requests 
#import bs4
In [ ]:
 
In [22]:
#for memereq in range(0,21):
#    memetm = requests.get('https://imgflip.com/memetemplates?page={}'.format(memereq))
#    bsme = bs4.BeautifulSoup(memetm.text)
    #print(bsme)
#    for ahr in bsme.find_all('h3'):
#        print(ahr)
        #print(ahr.find('a'))
/usr/lib/python3/dist-packages/bs4/__init__.py:181: UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.

The code that caused this warning is on line 193 of the file /usr/lib/python3.5/runpy.py. To get rid of this warning, change code that looks like this:

 BeautifulSoup([your markup])

to this:

 BeautifulSoup([your markup], "lxml")

  markup_type=markup_type))
<h3 class="mt-title">
<a href="/meme/One-Does-Not-Simply" title="One Does Not Simply Meme">One Does Not Simply</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Batman-Slapping-Robin" title="Batman Slapping Robin Meme">Batman Slapping Robin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ancient-Aliens" title="Ancient Aliens Meme">Ancient Aliens</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Futurama-Fry" title="Futurama Fry Meme">Futurama Fry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Most-Interesting-Man-In-The-World" title="The Most Interesting Man In The World Meme">The Most Interesting Man In The World</a>
</h3>
<h3 class="mt-title">
<a href="/meme/X-Everywhere" title="X, X Everywhere Meme">X, X Everywhere</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Waiting-Skeleton" title="Waiting Skeleton Meme">Waiting Skeleton</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Leonardo-Dicaprio-Cheers" title="Leonardo Dicaprio Cheers Meme">Leonardo Dicaprio Cheers</a>
</h3>
<h3 class="mt-title">
<a href="/meme/First-World-Problems" title="First World Problems Meme">First World Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Brace-Yourselves-X-is-Coming" title="Brace Yourselves X is Coming Meme">Brace Yourselves X is Coming</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Luck-Brian" title="Bad Luck Brian Meme">Bad Luck Brian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Y-U-No" title="Y U No Meme">Y U No</a>
</h3>
<h3 class="mt-title">
<a href="/meme/That-Would-Be-Great" title="That Would Be Great Meme">That Would Be Great</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Oprah-You-Get-A" title="Oprah You Get A Meme">Oprah You Get A</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Creepy-Condescending-Wonka" title="Creepy Condescending Wonka Meme">Creepy Condescending Wonka</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Boardroom-Meeting-Suggestion" title="Boardroom Meeting Suggestion Meme">Boardroom Meeting Suggestion</a>
</h3>
<h3 class="mt-title">
<a href="/meme/But-Thats-None-Of-My-Business" title="But Thats None Of My Business Meme">But Thats None Of My Business</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Doge" title="Doge Meme">Doge</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Captain-Picard-Facepalm" title="Captain Picard Facepalm Meme">Captain Picard Facepalm</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Yall-Got-Any-More-Of" title="Yall Got Any More Of Meme">Yall Got Any More Of</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Success-Kid" title="Success Kid Meme">Success Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat" title="Grumpy Cat Meme">Grumpy Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/X-All-The-Y" title="X All The Y Meme">X All The Y</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Third-World-Skeptical-Kid" title="Third World Skeptical Kid Meme">Third World Skeptical Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Matrix-Morpheus" title="Matrix Morpheus Meme">Matrix Morpheus</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Black-Girl-Wat" title="Black Girl Wat Meme">Black Girl Wat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Picard-Wtf" title="Picard Wtf Meme">Picard Wtf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Rock-Driving" title="The Rock Driving Meme">The Rock Driving</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Philosoraptor" title="Philosoraptor Meme">Philosoraptor</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Star-Wars-Yoda" title="Star Wars Yoda Meme">Star Wars Yoda</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dr-Evil-Laser" title="Dr Evil Laser Meme">Dr Evil Laser</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Face-You-Make-Robert-Downey-Jr" title="Face You Make Robert Downey Jr Meme">Face You Make Robert Downey Jr</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Disaster-Girl" title="Disaster Girl Meme">Disaster Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Confession-Bear" title="Confession Bear Meme">Confession Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Toddler" title="Evil Toddler Meme">Evil Toddler</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Finding-Neverland" title="Finding Neverland Meme">Finding Neverland</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grandma-Finds-The-Internet" title="Grandma Finds The Internet Meme">Grandma Finds The Internet</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Am-I-The-Only-One-Around-Here" title="Am I The Only One Around Here Meme">Am I The Only One Around Here</a>
</h3>
<h3 class="mt-title">
<a href="/meme/10-Guy" title="10 Guy Meme">10 Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Too-Damn-High" title="Too Damn High Meme">Too Damn High</a>
</h3>
<h3 class="mt-title">
<a href="/meme/One-Does-Not-Simply" title="One Does Not Simply Meme">One Does Not Simply</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Batman-Slapping-Robin" title="Batman Slapping Robin Meme">Batman Slapping Robin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ancient-Aliens" title="Ancient Aliens Meme">Ancient Aliens</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Futurama-Fry" title="Futurama Fry Meme">Futurama Fry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Most-Interesting-Man-In-The-World" title="The Most Interesting Man In The World Meme">The Most Interesting Man In The World</a>
</h3>
<h3 class="mt-title">
<a href="/meme/X-Everywhere" title="X, X Everywhere Meme">X, X Everywhere</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Waiting-Skeleton" title="Waiting Skeleton Meme">Waiting Skeleton</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Leonardo-Dicaprio-Cheers" title="Leonardo Dicaprio Cheers Meme">Leonardo Dicaprio Cheers</a>
</h3>
<h3 class="mt-title">
<a href="/meme/First-World-Problems" title="First World Problems Meme">First World Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Brace-Yourselves-X-is-Coming" title="Brace Yourselves X is Coming Meme">Brace Yourselves X is Coming</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Luck-Brian" title="Bad Luck Brian Meme">Bad Luck Brian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Y-U-No" title="Y U No Meme">Y U No</a>
</h3>
<h3 class="mt-title">
<a href="/meme/That-Would-Be-Great" title="That Would Be Great Meme">That Would Be Great</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Oprah-You-Get-A" title="Oprah You Get A Meme">Oprah You Get A</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Creepy-Condescending-Wonka" title="Creepy Condescending Wonka Meme">Creepy Condescending Wonka</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Boardroom-Meeting-Suggestion" title="Boardroom Meeting Suggestion Meme">Boardroom Meeting Suggestion</a>
</h3>
<h3 class="mt-title">
<a href="/meme/But-Thats-None-Of-My-Business" title="But Thats None Of My Business Meme">But Thats None Of My Business</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Doge" title="Doge Meme">Doge</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Captain-Picard-Facepalm" title="Captain Picard Facepalm Meme">Captain Picard Facepalm</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Yall-Got-Any-More-Of" title="Yall Got Any More Of Meme">Yall Got Any More Of</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Success-Kid" title="Success Kid Meme">Success Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat" title="Grumpy Cat Meme">Grumpy Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/X-All-The-Y" title="X All The Y Meme">X All The Y</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Third-World-Skeptical-Kid" title="Third World Skeptical Kid Meme">Third World Skeptical Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Matrix-Morpheus" title="Matrix Morpheus Meme">Matrix Morpheus</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Black-Girl-Wat" title="Black Girl Wat Meme">Black Girl Wat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Picard-Wtf" title="Picard Wtf Meme">Picard Wtf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Rock-Driving" title="The Rock Driving Meme">The Rock Driving</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Philosoraptor" title="Philosoraptor Meme">Philosoraptor</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Star-Wars-Yoda" title="Star Wars Yoda Meme">Star Wars Yoda</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dr-Evil-Laser" title="Dr Evil Laser Meme">Dr Evil Laser</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Face-You-Make-Robert-Downey-Jr" title="Face You Make Robert Downey Jr Meme">Face You Make Robert Downey Jr</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Disaster-Girl" title="Disaster Girl Meme">Disaster Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Confession-Bear" title="Confession Bear Meme">Confession Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Toddler" title="Evil Toddler Meme">Evil Toddler</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Finding-Neverland" title="Finding Neverland Meme">Finding Neverland</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grandma-Finds-The-Internet" title="Grandma Finds The Internet Meme">Grandma Finds The Internet</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Am-I-The-Only-One-Around-Here" title="Am I The Only One Around Here Meme">Am I The Only One Around Here</a>
</h3>
<h3 class="mt-title">
<a href="/meme/10-Guy" title="10 Guy Meme">10 Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Too-Damn-High" title="Too Damn High Meme">Too Damn High</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Third-World-Success-Kid" title="Third World Success Kid Meme">Third World Success Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dont-You-Squidward" title="Dont You Squidward Meme">Dont You Squidward</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Awkward-Moment-Sealion" title="Awkward Moment Sealion Meme">Awkward Moment Sealion</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Yo-Dawg-Heard-You" title="Yo Dawg Heard You Meme">Yo Dawg Heard You</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Maury-Lie-Detector" title="Maury Lie Detector Meme">Maury Lie Detector</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Aaaaand-Its-Gone" title="Aaaaand Its Gone Meme">Aaaaand Its Gone</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sparta-Leonidas" title="Sparta Leonidas Meme">Sparta Leonidas</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Laughing-Men-In-Suits" title="Laughing Men In Suits Meme">Laughing Men In Suits</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Skeptical-Baby" title="Skeptical Baby Meme">Skeptical Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Aint-Nobody-Got-Time-For-That" title="Aint Nobody Got Time For That Meme">Aint Nobody Got Time For That</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Say-That-Again-I-Dare-You" title="Say That Again I Dare You Meme">Say That Again I Dare You</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Put-It-Somewhere-Else-Patrick" title="Put It Somewhere Else Patrick Meme">Put It Somewhere Else Patrick</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Conspiracy-Keanu" title="Conspiracy Keanu Meme">Conspiracy Keanu</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Pun-Dog" title="Bad Pun Dog Meme">Bad Pun Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ill-Just-Wait-Here" title="Ill Just Wait Here Meme">Ill Just Wait Here</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mugatu-So-Hot-Right-Now" title="Mugatu So Hot Right Now Meme">Mugatu So Hot Right Now</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Back-In-My-Day" title="Back In My Day Meme">Back In My Day</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Steve-Harvey" title="Steve Harvey Meme">Steve Harvey</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Be-Like-Bill" title="Be Like Bill Meme">Be Like Bill</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Socially-Awesome-Awkward-Penguin" title="Socially Awesome Awkward Penguin Meme">Socially Awesome Awkward Penguin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/And-everybody-loses-their-minds" title="And everybody loses their minds Meme">And everybody loses their minds</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rick-and-Carl" title="Rick and Carl Meme">Rick and Carl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Archer" title="Archer Meme">Archer</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spongegar" title="Spongegar Meme">Spongegar</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scumbag-Steve" title="Scumbag Steve Meme">Scumbag Steve</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Imagination-Spongebob" title="Imagination Spongebob Meme">Imagination Spongebob</a>
</h3>
<h3 class="mt-title">
<a href="/meme/This-Is-Where-Id-Put-My-Trophy-If-I-Had-One" title="This Is Where I'd Put My Trophy If I Had One Meme">This Is Where I'd Put My Trophy If I Had One</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Kill-Yourself-Guy" title="Kill Yourself Guy Meme">Kill Yourself Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Pepperidge-Farm-Remembers" title="Pepperidge Farm Remembers Meme">Pepperidge Farm Remembers</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Arthur-Fist" title="Arthur Fist Meme">Arthur Fist</a>
</h3>
<h3 class="mt-title">
<a href="/meme/See-Nobody-Cares" title="See Nobody Cares Meme">See Nobody Cares</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Uncle-Sam" title="Uncle Sam Meme">Uncle Sam</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Should-Buy-A-Boat-Cat" title="I Should Buy A Boat Cat Meme">I Should Buy A Boat Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Liam-Neeson-Taken" title="Liam Neeson Taken Meme">Liam Neeson Taken</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Marvel-Civil-War-1" title="Marvel Civil War 1 Meme">Marvel Civil War 1</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Look-At-Me" title="Look At Me Meme">Look At Me</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Buddy-Christ" title="Buddy Christ Meme">Buddy Christ</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Jackie-Chan-WTF" title="Jackie Chan WTF Meme">Jackie Chan WTF</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Unpopular-Opinion-Puffin" title="Unpopular Opinion Puffin Meme">Unpopular Opinion Puffin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spiderman-Computer-Desk" title="Spiderman Computer Desk Meme">Spiderman Computer Desk</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Overly-Attached-Girlfriend" title="Overly Attached Girlfriend Meme">Overly Attached Girlfriend</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ryan-Gosling" title="Ryan Gosling Meme">Ryan Gosling</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Peter-Griffin-News" title="Peter Griffin News Meme">Peter Griffin News</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Shut-Up-And-Take-My-Money-Fry" title="Shut Up And Take My Money Fry Meme">Shut Up And Take My Money Fry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Good-Fellas-Hilarious" title="Good Fellas Hilarious Meme">Good Fellas Hilarious</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Gollum" title="Gollum Meme">Gollum</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Leonardo-Dicaprio-Wolf-Of-Wall-Street" title="Leonardo Dicaprio Wolf Of Wall Street Meme">Leonardo Dicaprio Wolf Of Wall Street</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ermahgerd-Berks" title="Ermahgerd Berks Meme">Ermahgerd Berks</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Its-Not-Going-To-Happen" title="Its Not Going To Happen Meme">Its Not Going To Happen</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hide-the-Pain-Harold" title="Hide the Pain Harold Meme">Hide the Pain Harold</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dave-Chappelle" title="Dave Chappelle Meme">Dave Chappelle</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Member-Berries" title="Member Berries Meme">Member Berries</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sudden-Clarity-Clarence" title="Sudden Clarity Clarence Meme">Sudden Clarity Clarence</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Cute-Cat" title="Cute Cat Meme">Cute Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Obi-Wan-Kenobi" title="Obi Wan Kenobi Meme">Obi Wan Kenobi</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spiderman-Peter-Parker" title="Spiderman Peter Parker Meme">Spiderman Peter Parker</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Surprised-Koala" title="Surprised Koala Meme">Surprised Koala</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Kevin-Hart-The-Hell" title="Kevin Hart The Hell Meme">Kevin Hart The Hell</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-The-Real-MVP" title="You The Real MVP Meme">You The Real MVP</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Im-The-Captain-Now" title="I'm The Captain Now Meme">I'm The Captain Now</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Confused-Gandalf" title="Confused Gandalf Meme">Confused Gandalf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/So-I-Got-That-Goin-For-Me-Which-Is-Nice" title="So I Got That Goin For Me Which Is Nice Meme">So I Got That Goin For Me Which Is Nice</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Jack-Sparrow-Being-Chased" title="Jack Sparrow Being Chased Meme">Jack Sparrow Being Chased</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Simba-Shadowy-Place" title="Simba Shadowy Place Meme">Simba Shadowy Place</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dwight-Schrute" title="Dwight Schrute Meme">Dwight Schrute</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Afraid-To-Ask-Andy" title="Afraid To Ask Andy Meme">Afraid To Ask Andy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bender" title="Bender Meme">Bender</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Star-Wars-No" title="Star Wars No Meme">Star Wars No</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Slowpoke" title="Slowpoke Meme">Slowpoke</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Keep-Calm-And-Carry-On-Red" title="Keep Calm And Carry On Red Meme">Keep Calm And Carry On Red</a>
</h3>
<h3 class="mt-title">
<a href="/meme/What-Do-We-Want" title="What Do We Want Meme">What Do We Want</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Satisfied-Seal" title="Satisfied Seal Meme">Satisfied Seal</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Unicorn-MAN" title="Unicorn MAN Meme">Unicorn MAN</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mr-T-Pity-The-Fool" title="Mr T Pity The Fool Meme">Mr T Pity The Fool</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ron-Burgundy" title="Ron Burgundy Meme">Ron Burgundy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Too-Like-To-Live-Dangerously" title="I Too Like To Live Dangerously Meme">I Too Like To Live Dangerously</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Guarantee-It" title="I Guarantee It Meme">I Guarantee It</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Look-At-All-These" title="Look At All These Meme">Look At All These</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Good-Guy-Greg" title="Good Guy Greg Meme">Good Guy Greg</a>
</h3>
<h3 class="mt-title">
<a href="/meme/DJ-Pauly-D" title="DJ Pauly D Meme">DJ Pauly D</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Thats-a-paddlin" title="That's a paddlin' Meme">That's a paddlin'</a>
</h3>
<h3 class="mt-title">
<a href="/meme/So-I-Guess-You-Can-Say-Things-Are-Getting-Pretty-Serious" title="So I Guess You Can Say Things Are Getting Pretty Serious Meme">So I Guess You Can Say Things Are Getting Pretty Serious</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-Should-Feel-Bad-Zoidberg" title="You Should Feel Bad Zoidberg Meme">You Should Feel Bad Zoidberg</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Inception" title="Inception Meme">Inception</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Heres-Johnny" title="Heres Johnny Meme">Heres Johnny</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Baby" title="Angry Baby Meme">Angry Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/1990s-First-World-Problems" title="1990s First World Problems Meme">1990s First World Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Michael-Jackson-Popcorn" title="Michael Jackson Popcorn Meme">Michael Jackson Popcorn</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Peter-Parker-Cry" title="Peter Parker Cry Meme">Peter Parker Cry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ill-Have-You-Know-Spongebob" title="Ill Have You Know Spongebob Meme">Ill Have You Know Spongebob</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-See-Dead-People" title="I See Dead People Meme">I See Dead People</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Insanity-Wolf" title="Insanity Wolf Meme">Insanity Wolf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Lion-King" title="Lion King Meme">Lion King</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Was-Told-There-Would-Be" title="I Was Told There Would Be Meme">I Was Told There Would Be</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Impossibru-Guy-Original" title="Impossibru Guy Original Meme">Impossibru Guy Original</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Approves" title="Chuck Norris Approves Meme">Chuck Norris Approves</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Table-Flip-Guy" title="Table Flip Guy Meme">Table Flip Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chef-Gordon-Ramsay" title="Chef Gordon Ramsay Meme">Chef Gordon Ramsay</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Super-Cool-Ski-Instructor" title="Super Cool Ski Instructor Meme">Super Cool Ski Instructor</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Joseph-Ducreux" title="Joseph Ducreux Meme">Joseph Ducreux</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Consuela" title="Consuela Meme">Consuela</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chubby-Bubbles-Girl" title="Chubby Bubbles Girl Meme">Chubby Bubbles Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/High-Expectations-Asian-Father" title="High Expectations Asian Father Meme">High Expectations Asian Father</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Buddy-The-Elf" title="Buddy The Elf Meme">Buddy The Elf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Warning-Sign" title="Warning Sign Meme">Warning Sign</a>
</h3>
<h3 class="mt-title">
<a href="/meme/First-Day-On-The-Internet-Kid" title="First Day On The Internet Kid Meme">First Day On The Internet Kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Cool-Cat-Stroll" title="Cool Cat Stroll Meme">Cool Cat Stroll</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Success-Kid-Original" title="Success Kid Original Meme">Success Kid Original</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Liam-Neeson-Taken-2" title="Liam Neeson Taken 2 Meme">Liam Neeson Taken 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Unhelpful-High-School-Teacher" title="Unhelpful High School Teacher Meme">Unhelpful High School Teacher</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Frustrated-Boromir" title="Frustrated Boromir Meme">Frustrated Boromir</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spiderman-Hospital" title="Spiderman Hospital Meme">Spiderman Hospital</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Plotting-Raccoon" title="Evil Plotting Raccoon Meme">Evil Plotting Raccoon</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Overly-Manly-Man" title="Overly Manly Man Meme">Overly Manly Man</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mr-Krabs-Blur-Meme" title="Mr Krabs Blur Meme Meme">Mr Krabs Blur Meme</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Persian-Cat-Room-Guardian" title="Persian Cat Room Guardian Meme">Persian Cat Room Guardian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Actual-Advice-Mallard" title="Actual Advice Mallard Meme">Actual Advice Mallard</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rick-and-Carl-Long" title="Rick and Carl Long Meme">Rick and Carl Long</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Pissed-Off-Obama" title="Pissed Off Obama Meme">Pissed Off Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Afraid-To-Ask-Andy-Closeup" title="Afraid To Ask Andy (Closeup) Meme">Afraid To Ask Andy (Closeup)</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Brian-Williams-Was-There" title="Brian Williams Was There Meme">Brian Williams Was There</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Captain-Phillips-Im-The-Captain-Now" title="Captain Phillips - I'm The Captain Now Meme">Captain Phillips - I'm The Captain Now</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scared-Cat" title="Scared Cat Meme">Scared Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Neil-deGrasse-Tyson" title="Neil deGrasse Tyson Meme">Neil deGrasse Tyson</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Good-Guy-Putin" title="Good Guy Putin Meme">Good Guy Putin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Inigo-Montoya" title="Inigo Montoya Meme">Inigo Montoya</a>
</h3>
<h3 class="mt-title">
<a href="/meme/College-Liberal" title="College Liberal Meme">College Liberal</a>
</h3>
<h3 class="mt-title">
<a href="/meme/These-Arent-The-Droids-You-Were-Looking-For" title="These Arent The Droids You Were Looking For Meme">These Arent The Droids You Were Looking For</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Yao-Ming" title="Yao Ming Meme">Yao Ming</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Smiling-Cat" title="Smiling Cat Meme">Smiling Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Today-Was-A-Good-Day" title="Today Was A Good Day Meme">Today Was A Good Day</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Minor-Mistake-Marvin" title="Minor Mistake Marvin Meme">Minor Mistake Marvin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Nuclear-Explosion" title="Nuclear Explosion Meme">Nuclear Explosion</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scary-Harry" title="Scary Harry Meme">Scary Harry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Squidward" title="Squidward Meme">Squidward</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ugly-Twins" title="Ugly Twins Meme">Ugly Twins</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Marvel-Civil-War" title="Marvel Civil War Meme">Marvel Civil War</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Money-Money" title="Money Money Meme">Money Money</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Baby-Godfather" title="Baby Godfather Meme">Baby Godfather</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Socially-Awkward-Awesome-Penguin" title="Socially Awkward Awesome Penguin Meme">Socially Awkward Awesome Penguin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hillary-Clinton" title="Hillary Clinton Meme">Hillary Clinton</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hey-Internet" title="Hey Internet Meme">Hey Internet</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Shrek-Cat" title="Shrek Cat Meme">Shrek Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/How-Tough-Are-You" title="How Tough Are You Meme">How Tough Are You</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Forever-Alone" title="Forever Alone Meme">Forever Alone</a>
</h3>
<h3 class="mt-title">
<a href="/meme/What-Year-Is-It" title="What Year Is It Meme">What Year Is It</a>
</h3>
<h3 class="mt-title">
<a href="/meme/No-Patrick" title="No Patrick Meme">No Patrick</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Computer-Guy-Facepalm" title="Computer Guy Facepalm Meme">Computer Guy Facepalm</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Look-Son" title="Look Son Meme">Look Son</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Laughing-Villains" title="Laughing Villains Meme">Laughing Villains</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Obama" title="Obama Meme">Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Welcome-To-The-Internets" title="Welcome To The Internets Meme">Welcome To The Internets</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Toddler" title="Angry Toddler Meme">Angry Toddler</a>
</h3>
<h3 class="mt-title">
<a href="/meme/1950s-Middle-Finger" title="1950s Middle Finger Meme">1950s Middle Finger</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hide-Yo-Kids-Hide-Yo-Wife" title="Hide Yo Kids Hide Yo Wife Meme">Hide Yo Kids Hide Yo Wife</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Jack-Nicholson-The-Shining-Snow" title="Jack Nicholson The Shining Snow Meme">Jack Nicholson The Shining Snow</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Cows" title="Evil Cows Meme">Evil Cows</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Permission-Bane" title="Permission Bane Meme">Permission Bane</a>
</h3>
<h3 class="mt-title">
<a href="/meme/TED" title="TED Meme">TED</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Karate-Kyle" title="Karate Kyle Meme">Karate Kyle</a>
</h3>
<h3 class="mt-title">
<a href="/meme/What-Do-We-Want-3" title="What Do We Want 3 Meme">What Do We Want 3</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Cool-Obama" title="Cool Obama Meme">Cool Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/PPAP" title="PPAP Meme">PPAP</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-The-Real-MVP-2" title="You The Real MVP 2 Meme">You The Real MVP 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-Were-The-Chosen-One-Star-Wars" title="You Were The Chosen One (Star Wars) Meme">You Were The Chosen One (Star Wars)</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Business-Cat" title="Business Cat Meme">Business Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Michael-Phelps-Death-Stare" title="Michael Phelps Death Stare Meme">Michael Phelps Death Stare</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Comic-Book-Guy" title="Comic Book Guy Meme">Comic Book Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Right-In-The-Childhood" title="Right In The Childhood Meme">Right In The Childhood</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Why-Cant-I-Hold-All-These-Limes" title="Why Can't I Hold All These Limes Meme">Why Can't I Hold All These Limes</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Its-Finally-Over" title="Its Finally Over Meme">Its Finally Over</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Excited-Minions" title="Excited Minions Meme">Excited Minions</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Alright-Gentlemen-We-Need-A-New-Idea" title="Alright Gentlemen We Need A New Idea Meme">Alright Gentlemen We Need A New Idea</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Deadpool-Surprised" title="Deadpool Surprised Meme">Deadpool Surprised</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Heavy-Breathing-Cat" title="Heavy Breathing Cat Meme">Heavy Breathing Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Troll-Face" title="Troll Face Meme">Troll Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Internet-Guide" title="Internet Guide Meme">Internet Guide</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dad-Joke-Dog" title="Dad Joke Dog Meme">Dad Joke Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Troll-Face-Colored" title="Troll Face Colored Meme">Troll Face Colored</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Batman-Smiles" title="Batman Smiles Meme">Batman Smiles</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Internet-Explorer" title="Internet Explorer Meme">Internet Explorer</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bill-Murray-Golf" title="Bill Murray Golf Meme">Bill Murray Golf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chemistry-Cat" title="Chemistry Cat Meme">Chemistry Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/If-You-Know-What-I-Mean-Bean" title="If You Know What I Mean Bean Meme">If You Know What I Mean Bean</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rick-and-Carl-3" title="Rick and Carl 3 Meme">Rick and Carl 3</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spiderman-Laugh" title="Spiderman Laugh Meme">Spiderman Laugh</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bear-Grylls" title="Bear Grylls Meme">Bear Grylls</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Computer-Guy" title="Computer Guy Meme">Computer Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Interupting-Kanye" title="Interupting Kanye Meme">Interupting Kanye</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Why-Is-The-Rum-Gone" title="Why Is The Rum Gone Meme">Why Is The Rum Gone</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Snape" title="Snape Meme">Snape</a>
</h3>
<h3 class="mt-title">
<a href="/meme/RPG-Fan" title="RPG Fan Meme">RPG Fan</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Have-No-Idea-What-I-Am-Doing" title="I Have No Idea What I Am Doing Meme">I Have No Idea What I Am Doing</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Baby-Insanity-Wolf" title="Baby Insanity Wolf Meme">Baby Insanity Wolf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Well-That-Escalated-Quickly" title="Well That Escalated Quickly Meme">Well That Escalated Quickly</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Joke-Eel" title="Bad Joke Eel Meme">Bad Joke Eel</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chainsaw-Bear" title="Chainsaw Bear Meme">Chainsaw Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Futurama-Zoidberg" title="Futurama Zoidberg Meme">Futurama Zoidberg</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Keep-Calm-And-Carry-On-Black" title="Keep Calm And Carry On Black Meme">Keep Calm And Carry On Black</a>
</h3>
<h3 class="mt-title">
<a href="/meme/No-I-Cant-Obama" title="No I Cant Obama Meme">No I Cant Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chill-Out-Lemur" title="Chill Out Lemur Meme">Chill Out Lemur</a>
</h3>
<h3 class="mt-title">
<a href="/meme/But-Thats-None-Of-My-Business-Neutral" title="But Thats None Of My Business (Neutral) Meme">But Thats None Of My Business (Neutral)</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Birthday" title="Grumpy Cat Birthday Meme">Grumpy Cat Birthday</a>
</h3>
<h3 class="mt-title">
<a href="/meme/We-Will-Rebuild" title="We Will Rebuild Meme">We Will Rebuild</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-Underestimate-My-Power" title="You Underestimate My Power Meme">You Underestimate My Power</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Austin-Powers-Honestly" title="Austin Powers Honestly Meme">Austin Powers Honestly</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Oh-No" title="Oh No Meme">Oh No</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Chef-Gordon-Ramsay" title="Angry Chef Gordon Ramsay Meme">Angry Chef Gordon Ramsay</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Challenge-Accepted-Rage-Face" title="Challenge Accepted Rage Face Meme">Challenge Accepted Rage Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hipster-Barista" title="Hipster Barista Meme">Hipster Barista</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sergeant-Hartmann" title="Sergeant Hartmann Meme">Sergeant Hartmann</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mr-Mackey" title="Mr Mackey Meme">Mr Mackey</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sidious-Error" title="Sidious Error Meme">Sidious Error</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Captain-Hindsight" title="Captain Hindsight Meme">Captain Hindsight</a>
</h3>
<h3 class="mt-title">
<a href="/meme/McKayla-Maroney-Not-Impressed" title="McKayla Maroney Not Impressed Meme">McKayla Maroney Not Impressed</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Why-Not-Both" title="Why Not Both Meme">Why Not Both</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Weird-Stuff-I-Do-Potoo" title="Weird Stuff I Do Potoo Meme">Weird Stuff I Do Potoo</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Life-Sucks" title="Life Sucks Meme">Life Sucks</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Kim-Jong-Un-Sad" title="Kim Jong Un Sad Meme">Kim Jong Un Sad</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hillary-Clinton-Cellphone" title="Hillary Clinton Cellphone Meme">Hillary Clinton Cellphone</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Socially-Awkward-Penguin" title="Socially Awkward Penguin Meme">Socially Awkward Penguin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Duck-Face-Chicks" title="Duck Face Chicks Meme">Duck Face Chicks</a>
</h3>
<h3 class="mt-title">
<a href="/meme/LOL-Guy" title="LOL Guy Meme">LOL Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Take-A-Seat-Cat" title="Take A Seat Cat Meme">Take A Seat Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Yo-Mamas-So-Fat" title="Yo Mamas So Fat Meme">Yo Mamas So Fat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Patriotic-Eagle" title="Patriotic Eagle Meme">Patriotic Eagle</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Baby" title="Evil Baby Meme">Evil Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/ZNMD" title="ZNMD Meme">ZNMD</a>
</h3>
<h3 class="mt-title">
<a href="/meme/True-Story" title="True Story Meme">True Story</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Secure-Parking" title="Secure Parking Meme">Secure Parking</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sexy-Railroad-Spiderman" title="Sexy Railroad Spiderman Meme">Sexy Railroad Spiderman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Barney-Stinson-Win" title="Barney Stinson Win Meme">Barney Stinson Win</a>
</h3>
<h3 class="mt-title">
<a href="/meme/LIGAF" title="LIGAF Meme">LIGAF</a>
</h3>
<h3 class="mt-title">
<a href="/meme/And-then-I-said-Obama" title="And then I said Obama Meme">And then I said Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Guns" title="Chuck Norris Guns Meme">Chuck Norris Guns</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Patrick-Says" title="Patrick Says Meme">Patrick Says</a>
</h3>
<h3 class="mt-title">
<a href="/meme/How-About-No-Bear" title="How About No Bear Meme">How About No Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Gotta-Go-Cat" title="Gotta Go Cat Meme">Gotta Go Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Burn-Kitty" title="Burn Kitty Meme">Burn Kitty</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dat-Boi" title="Dat Boi Meme">Dat Boi</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ceiling-Cat" title="Ceiling Cat Meme">Ceiling Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mad-Money-Jim-Cramer" title="Mad Money Jim Cramer Meme">Mad Money Jim Cramer</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ron-Swanson" title="Ron Swanson Meme">Ron Swanson</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Excited-Cat" title="Excited Cat Meme">Excited Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hipster-Ariel" title="Hipster Ariel Meme">Hipster Ariel</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Successful-Black-Man" title="Successful Black Man Meme">Successful Black Man</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Luck-Bear" title="Bad Luck Bear Meme">Bad Luck Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Not-Bad-Obama" title="Not Bad Obama Meme">Not Bad Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Surprised-Coala" title="Surprised Coala Meme">Surprised Coala</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Anti-Joke-Chicken" title="Anti Joke Chicken Meme">Anti Joke Chicken</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Oprah-You-Get-A-Car-Everybody-Gets-A-Car" title="Oprah You Get A Car Everybody Gets A Car Meme">Oprah You Get A Car Everybody Gets A Car</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Alien-Meeting-Suggestion" title="Alien Meeting Suggestion Meme">Alien Meeting Suggestion</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Facepalm-Bear" title="Facepalm Bear Meme">Facepalm Bear</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sad-Spiderman" title="Sad Spiderman Meme">Sad Spiderman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Lazy-College-Senior" title="Lazy College Senior Meme">Lazy College Senior</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Morgan-Freeman-Good-Luck" title="Morgan Freeman Good Luck Meme">Morgan Freeman Good Luck</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Star-Wars" title="Grumpy Cat Star Wars Meme">Grumpy Cat Star Wars</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sad-Keanu" title="Sad Keanu Meme">Sad Keanu</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bitch-Please" title="Bitch Please Meme">Bitch Please</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Short-Satisfaction-VS-Truth" title="Short Satisfaction VS Truth Meme">Short Satisfaction VS Truth</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scumbag-Boss" title="Scumbag Boss Meme">Scumbag Boss</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Talk-To-Spongebob" title="Talk To Spongebob Meme">Talk To Spongebob</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris" title="Chuck Norris Meme">Chuck Norris</a>
</h3>
<h3 class="mt-title">
<a href="/meme/OMG-Cat" title="OMG Cat Meme">OMG Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Know-That-Feel-Bro" title="I Know That Feel Bro Meme">I Know That Feel Bro</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Vladimir-Putin" title="Vladimir Putin Meme">Vladimir Putin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Advice-Yoda" title="Advice Yoda Meme">Advice Yoda</a>
</h3>
<h3 class="mt-title">
<a href="/meme/WTF" title="WTF Meme">WTF</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Castaway-Fire" title="Castaway Fire Meme">Castaway Fire</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Most-Interesting-Cat-In-The-World" title="The Most Interesting Cat In The World Meme">The Most Interesting Cat In The World</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Psy-Horse-Dance" title="Psy Horse Dance Meme">Psy Horse Dance</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Redditors-Wife" title="Redditors Wife Meme">Redditors Wife</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Crying-Because-Of-Cute" title="Crying Because Of Cute Meme">Crying Because Of Cute</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Samuel-Jackson-Glance" title="Samuel Jackson Glance Meme">Samuel Jackson Glance</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Pun-Anna-Kendrick" title="Bad Pun Anna Kendrick Meme">Bad Pun Anna Kendrick</a>
</h3>
<h3 class="mt-title">
<a href="/meme/TSA-Douche" title="TSA Douche Meme">TSA Douche</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Albert-Einstein-1" title="Albert Einstein 1 Meme">Albert Einstein 1</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Smiling-Jesus" title="Smiling Jesus Meme">Smiling Jesus</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Laughing-Goat" title="Laughing Goat Meme">Laughing Goat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-Get-An-X-And-You-Get-An-X" title="You Get An X And You Get An X Meme">You Get An X And You Get An X</a>
</h3>
<h3 class="mt-title">
<a href="/meme/2nd-Term-Obama" title="2nd Term Obama Meme">2nd Term Obama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Blank-Yellow-Sign" title="Blank Yellow Sign Meme">Blank Yellow Sign</a>
</h3>
<h3 class="mt-title">
<a href="/meme/OMG-Karen" title="OMG Karen Meme">OMG Karen</a>
</h3>
<h3 class="mt-title">
<a href="/meme/confession-kid" title="confession kid Meme">confession kid</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sad-Baby" title="Sad Baby Meme">Sad Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Reverse" title="Grumpy Cat Reverse Meme">Grumpy Cat Reverse</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sad-Cat" title="Sad Cat Meme">Sad Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Multi-Doge" title="Multi Doge Meme">Multi Doge</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Happy-Star-Congratulations" title="Happy Star Congratulations Meme">Happy Star Congratulations</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Obama-No-Listen" title="Obama No Listen Meme">Obama No Listen</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bazooka-Squirrel" title="Bazooka Squirrel Meme">Bazooka Squirrel</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Malicious-Advice-Mallard" title="Malicious Advice Mallard Meme">Malicious Advice Mallard</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Seriously-Face" title="Seriously Face Meme">Seriously Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Its-True-All-of-It-Han-Solo" title="It's True All of It Han Solo Meme">It's True All of It Han Solo</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Blank-Colored-Background" title="Blank Colored Background Meme">Blank Colored Background</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Persian-Cat-Room-Guardian-Single" title="Persian Cat Room Guardian Single Meme">Persian Cat Room Guardian Single</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Phone" title="Chuck Norris Phone Meme">Chuck Norris Phone</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chocolate-Spongebob" title="Chocolate Spongebob Meme">Chocolate Spongebob</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Butthurt-Dweller" title="Butthurt Dweller Meme">Butthurt Dweller</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mother-Of-God" title="Mother Of God Meme">Mother Of God</a>
</h3>
<h3 class="mt-title">
<a href="/meme/First-World-Stoner-Problems" title="First World Stoner Problems Meme">First World Stoner Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Annoying-Facebook-Girl" title="Annoying Facebook Girl Meme">Annoying Facebook Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Penguin-Gang" title="Penguin Gang Meme">Penguin Gang</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Beard-Baby" title="Beard Baby Meme">Beard Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Joker-Rainbow-Hands" title="Joker Rainbow Hands Meme">Joker Rainbow Hands</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Steve-Jobs" title="Steve Jobs Meme">Steve Jobs</a>
</h3>
<h3 class="mt-title">
<a href="/meme/PTSD-Clarinet-Boy" title="PTSD Clarinet Boy Meme">PTSD Clarinet Boy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Monkey-Business" title="Monkey Business Meme">Monkey Business</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dr-Evil" title="Dr Evil Meme">Dr Evil</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Thats-Just-Something-X-Say" title="Thats Just Something X Say Meme">Thats Just Something X Say</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Cereal-Guy-Spitting" title="Cereal Guy Spitting Meme">Cereal Guy Spitting</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Bed" title="Grumpy Cat Bed Meme">Grumpy Cat Bed</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Priority-Peter" title="Priority Peter Meme">Priority Peter</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hohoho" title="Hohoho Meme">Hohoho</a>
</h3>
<h3 class="mt-title">
<a href="/meme/No-Nappa-Its-A-Trick" title="No Nappa Its A Trick Meme">No Nappa Its A Trick</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Rock-It-Doesnt-Matter" title="The Rock It Doesnt Matter Meme">The Rock It Doesnt Matter</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scumbag-Brain" title="Scumbag Brain Meme">Scumbag Brain</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Family-Guy-Brian" title="Family Guy Brian Meme">Family Guy Brian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Confused-Lebowski" title="Confused Lebowski Meme">Confused Lebowski</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bonobo-Lyfe" title="Bonobo Lyfe Meme">Bonobo Lyfe</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Guinness-World-Record" title="Guinness World Record Meme">Guinness World Record</a>
</h3>
<h3 class="mt-title">
<a href="/meme/No-Bullshit-Business-Baby" title="No Bullshit Business Baby Meme">No Bullshit Business Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Question-Rage-Face" title="Question Rage Face Meme">Question Rage Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Homophobic-Seal" title="Homophobic Seal Meme">Homophobic Seal</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Skype" title="Skype Meme">Skype</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Finger" title="Chuck Norris Finger Meme">Chuck Norris Finger</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Intelligent-Dog" title="Intelligent Dog Meme">Intelligent Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Overjoyed" title="Overjoyed Meme">Overjoyed</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Barbosa-And-Sparrow" title="Barbosa And Sparrow Meme">Barbosa And Sparrow</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Zombie-Overly-Attached-Girlfriend" title="Zombie Overly Attached Girlfriend Meme">Zombie Overly Attached Girlfriend</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Super-Birthday-Squirrel" title="Super Birthday Squirrel Meme">Super Birthday Squirrel</a>
</h3>
<h3 class="mt-title">
<a href="/meme/So-Much-Drama" title="So Much Drama Meme">So Much Drama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rick-Grimes" title="Rick Grimes Meme">Rick Grimes</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Money-Man" title="Money Man Meme">Money Man</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Monkey-OOH" title="Monkey OOH Meme">Monkey OOH</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dallas-Cowboys" title="Dallas Cowboys Meme">Dallas Cowboys</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bill-OReilly" title="Bill OReilly Meme">Bill OReilly</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Eye-Of-Sauron" title="Eye Of Sauron Meme">Eye Of Sauron</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Marvel-Civil-War-2" title="Marvel Civil War 2 Meme">Marvel Civil War 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Kool-Kid-Klan" title="Kool Kid Klan Meme">Kool Kid Klan</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Paranoid-Parrot" title="Paranoid Parrot Meme">Paranoid Parrot</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Because-Race-Car" title="Because Race Car Meme">Because Race Car</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Frowning-Nun" title="Frowning Nun Meme">Frowning Nun</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Gasp-Rage-Face" title="Gasp Rage Face Meme">Gasp Rage Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Trailer-Park-Boys-Bubbles" title="Trailer Park Boys Bubbles Meme">Trailer Park Boys Bubbles</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Subtle-Pickup-Liner" title="Subtle Pickup Liner Meme">Subtle Pickup Liner</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Brian-Williams-Was-There-2" title="Brian Williams Was There 2 Meme">Brian Williams Was There 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Derp" title="Derp Meme">Derp</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Will-Ferrell" title="Will Ferrell Meme">Will Ferrell</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rasta-Science-Teacher" title="Rasta Science Teacher Meme">Rasta Science Teacher</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Batman-And-Superman" title="Batman And Superman Meme">Batman And Superman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Courage-Wolf" title="Courage Wolf Meme">Courage Wolf</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Not-Amused" title="Grumpy Cat Not Amused Meme">Grumpy Cat Not Amused</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Koala" title="Angry Koala Meme">Angry Koala</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Foul-Bachelor-Frog" title="Foul Bachelor Frog Meme">Foul Bachelor Frog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spiderman-Camera" title="Spiderman Camera Meme">Spiderman Camera</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Keep-Calm-And-Carry-On-Aqua" title="Keep Calm And Carry On Aqua Meme">Keep Calm And Carry On Aqua</a>
</h3>
<h3 class="mt-title">
<a href="/meme/First-World-Problems-Cat" title="First World Problems Cat Meme">First World Problems Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Han-Solo" title="Han Solo Meme">Han Solo</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Travelonshark" title="Travelonshark Meme">Travelonshark</a>
</h3>
<h3 class="mt-title">
<a href="/meme/College-Freshman" title="College Freshman Meme">College Freshman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mr-T" title="Mr T Meme">Mr T</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chavez" title="Chavez Meme">Chavez</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Happy" title="Grumpy Cat Happy Meme">Grumpy Cat Happy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/George-Washington" title="George Washington Meme">George Washington</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Christmas" title="Grumpy Cat Christmas Meme">Grumpy Cat Christmas</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rebecca-Black" title="Rebecca Black Meme">Rebecca Black</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Pathetic-Spidey" title="Pathetic Spidey Meme">Pathetic Spidey</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ermahgerd-Beyonce" title="Ermahgerd Beyonce Meme">Ermahgerd Beyonce</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ghetto-Jesus" title="Ghetto Jesus Meme">Ghetto Jesus</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Ordinary-Muslim-Man" title="Ordinary Muslim Man Meme">Ordinary Muslim Man</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Fear-And-Loathing-Cat" title="Fear And Loathing Cat Meme">Fear And Loathing Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/George-Bush" title="George Bush Meme">George Bush</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Advice-Dog" title="Advice Dog Meme">Advice Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sigmund-Freud" title="Sigmund Freud Meme">Sigmund Freud</a>
</h3>
<h3 class="mt-title">
<a href="/meme/You-Dont-Say" title="You Don't Say Meme">You Don't Say</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Socially-Awesome-Penguin" title="Socially Awesome Penguin Meme">Socially Awesome Penguin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Keep-Calm-And-Carry-On-Purple" title="Keep Calm And Carry On Purple Meme">Keep Calm And Carry On Purple</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-With-Guns" title="Chuck Norris With Guns Meme">Chuck Norris With Guns</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Officer-Cartman" title="Officer Cartman Meme">Officer Cartman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Redneck-Randal" title="Redneck Randal Meme">Redneck Randal</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sad-X-All-The-Y" title="Sad X All The Y Meme">Sad X All The Y</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Efrain-Juarez" title="Efrain Juarez Meme">Efrain Juarez</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Engineering-Professor" title="Engineering Professor Meme">Engineering Professor</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Condescending-Goku" title="Condescending Goku Meme">Condescending Goku</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Advice-God" title="Advice God Meme">Advice God</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Asian" title="Angry Asian Meme">Angry Asian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mega-Rage-Face" title="Mega Rage Face Meme">Mega Rage Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Happy-Guy-Rage-Face" title="Happy Guy Rage Face Meme">Happy Guy Rage Face</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Storytelling-Grandpa" title="Storytelling Grandpa Meme">Storytelling Grandpa</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Men-Laughing" title="Men Laughing Meme">Men Laughing</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Metal-Jesus" title="Metal Jesus Meme">Metal Jesus</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Depressed-Cat" title="Depressed Cat Meme">Depressed Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/High-Dog" title="High Dog Meme">High Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Police-Officer-Testifying" title="Police Officer Testifying Meme">Police Officer Testifying</a>
</h3>
<h3 class="mt-title">
<a href="/meme/1st-World-Canadian-Problems" title="1st World Canadian Problems Meme">1st World Canadian Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Unwanted-House-Guest" title="Unwanted House Guest Meme">Unwanted House Guest</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Lil-Wayne" title="Lil Wayne Meme">Lil Wayne</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Admiral-Ackbar-Relationship-Expert" title="Admiral Ackbar Relationship Expert Meme">Admiral Ackbar Relationship Expert</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Blank-Blue-Background" title="Blank Blue Background Meme">Blank Blue Background</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Laughing" title="Chuck Norris Laughing Meme">Chuck Norris Laughing</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Onde" title="Onde Meme">Onde</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Doge-2" title="Doge 2 Meme">Doge 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Have-No-Idea-What-I-Am-Doing-Dog" title="I Have No Idea What I Am Doing Dog Meme">I Have No Idea What I Am Doing Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Shouter" title="Shouter Meme">Shouter</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Quit-Hatin" title="Quit Hatin Meme">Quit Hatin</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Original-Stoner-Dog" title="Original Stoner Dog Meme">Original Stoner Dog</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bill-Nye-The-Science-Guy" title="Bill Nye The Science Guy Meme">Bill Nye The Science Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Evil-Otter" title="Evil Otter Meme">Evil Otter</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Rick-and-Carl-Longer" title="Rick and Carl Longer Meme">Rick and Carl Longer</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Fat-Cat" title="Fat Cat Meme">Fat Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Chuck-Norris-Flex" title="Chuck Norris Flex Meme">Chuck Norris Flex</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Eighties-Teen" title="Eighties Teen Meme">Eighties Teen</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Cute-Puppies" title="Cute Puppies Meme">Cute Puppies</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Turkey" title="Turkey Meme">Turkey</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Eminem" title="Eminem Meme">Eminem</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Pony-Shrugs" title="Pony Shrugs Meme">Pony Shrugs</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Father-Ted" title="Father Ted Meme">Father Ted</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Original-Bad-Luck-Brian" title="Original Bad Luck Brian Meme">Original Bad Luck Brian</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Bobs" title="The Bobs Meme">The Bobs</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Wrong-Neighboorhood-Cats" title="Wrong Neighboorhood Cats Meme">Wrong Neighboorhood Cats</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Team-Rocket" title="Team Rocket Meme">Team Rocket</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Osabama" title="Osabama Meme">Osabama</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scooby-Doo" title="Scooby Doo Meme">Scooby Doo</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Oblivious-Hot-Girl" title="Oblivious Hot Girl Meme">Oblivious Hot Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hercules-Hades" title="Hercules Hades Meme">Hercules Hades</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Confused-Mel-Gibson" title="Confused Mel Gibson Meme">Confused Mel Gibson</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Smilin-Biden" title="Smilin Biden Meme">Smilin Biden</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Happy-Minaj" title="Happy Minaj Meme">Happy Minaj</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Vengeance-Dad" title="Vengeance Dad Meme">Vengeance Dad</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Surprized-Vegeta" title="Surprized Vegeta Meme">Surprized Vegeta</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Surpised-Frodo" title="Surpised Frodo Meme">Surpised Frodo</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Serious-Xzibit" title="Serious Xzibit Meme">Serious Xzibit</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Deadpool-Pick-Up-Lines" title="Deadpool Pick Up Lines Meme">Deadpool Pick Up Lines</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Downvoting-Roman" title="Downvoting Roman Meme">Downvoting Roman</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Insanity-Puppy" title="Insanity Puppy Meme">Insanity Puppy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Small-Face-Romney" title="Small Face Romney Meme">Small Face Romney</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Predator" title="Predator Meme">Predator</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Angry-Bride" title="Angry Bride Meme">Angry Bride</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Macklemore-Thrift-Store" title="Macklemore Thrift Store Meme">Macklemore Thrift Store</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Jammin-Baby" title="Jammin Baby Meme">Jammin Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Sheltering-Suburban-Mom" title="Sheltering Suburban Mom Meme">Sheltering Suburban Mom</a>
</h3>
<h3 class="mt-title">
<a href="/meme/V-For-Vendetta" title="V For Vendetta Meme">V For Vendetta</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Baby-Cry" title="Baby Cry Meme">Baby Cry</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Romney-And-Ryan" title="Romney And Ryan Meme">Romney And Ryan</a>
</h3>
<h3 class="mt-title">
<a href="/meme/The-Probelm-Is" title="The Probelm Is Meme">The Probelm Is</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Laundry-Viking" title="Laundry Viking Meme">Laundry Viking</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bad-Advice-Cat" title="Bad Advice Cat Meme">Bad Advice Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Disappointed-Tyson" title="Disappointed Tyson Meme">Disappointed Tyson</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Criana" title="Criana Meme">Criana</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Hoody-Cat" title="Hoody Cat Meme">Hoody Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Babushkas-On-Facebook" title="Babushkas On Facebook Meme">Babushkas On Facebook</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Gangnam-Style-PSY" title="Gangnam Style PSY Meme">Gangnam Style PSY</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Okay-Truck" title="Okay Truck Meme">Okay Truck</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Okay-Guy-Rage-Face2" title="Okay Guy Rage Face2 Meme">Okay Guy Rage Face2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Unhappy-Baby" title="Unhappy Baby Meme">Unhappy Baby</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mozart-Not-Sure" title="Mozart Not Sure Meme">Mozart Not Sure</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Darth-Maul" title="Darth Maul Meme">Darth Maul</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Lame-Pun-Coon" title="Lame Pun Coon Meme">Lame Pun Coon</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Gandhi" title="Gandhi Meme">Gandhi</a>
</h3>
<h3 class="mt-title">
<a href="/meme/I-Will-Find-You-And-Kill-You" title="I Will Find You And Kill You Meme">I Will Find You And Kill You</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Invalid-Argument-Vader" title="Invalid Argument Vader Meme">Invalid Argument Vader</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Robots" title="Robots Meme">Robots</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Futurama-Leela" title="Futurama Leela Meme">Futurama Leela</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Scrooge-McDuck-2" title="Scrooge McDuck 2 Meme">Scrooge McDuck 2</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Bart-Simpson-Peeking" title="Bart Simpson Peeking Meme">Bart Simpson Peeking</a>
</h3>
<h3 class="mt-title">
<a href="/meme/FFFFFFFUUUUUUUUUUUU" title="FFFFFFFUUUUUUUUUUUU Meme">FFFFFFFUUUUUUUUUUUU</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Lethal-Weapon-Danny-Glover" title="Lethal Weapon Danny Glover Meme">Lethal Weapon Danny Glover</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Spangles" title="Spangles Meme">Spangles</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Over-Educated-Problems" title="Over Educated Problems Meme">Over Educated Problems</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Mario-Hammer-Smash" title="Mario Hammer Smash Meme">Mario Hammer Smash</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Kill-You-Cat" title="Kill You Cat Meme">Kill You Cat</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Idiot-Nerd-Girl" title="Idiot Nerd Girl Meme">Idiot Nerd Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Relaxed-Office-Guy" title="Relaxed Office Guy Meme">Relaxed Office Guy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Stop-Cop" title="Stop Cop Meme">Stop Cop</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Big-Bird-And-Snuffy" title="Big Bird And Snuffy Meme">Big Bird And Snuffy</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Success-Kid-Girl" title="Success Kid Girl Meme">Success Kid Girl</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Professor-Oak" title="Professor Oak Meme">Professor Oak</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Grumpy-Cat-Table" title="Grumpy Cat Table Meme">Grumpy Cat Table</a>
</h3>
<h3 class="mt-title">
<a href="/meme/Dating-Site-Murderer" title="Dating Site Murderer Meme">Dating Site Murderer</a>
</h3>
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-22-3e965f2cb8dd> in <module>()
      1 for memereq in range(0,21):
----> 2     memetm = requests.get('https://imgflip.com/memetemplates?page={}'.format(memereq))
      3     bsme = bs4.BeautifulSoup(memetm.text)
      4     #print(bsme)
      5     for ahr in bsme.find_all('h3'):

/usr/local/lib/python3.5/dist-packages/requests/api.py in get(url, params, **kwargs)
     70 
     71     kwargs.setdefault('allow_redirects', True)
---> 72     return request('get', url, params=params, **kwargs)
     73 
     74 

/usr/local/lib/python3.5/dist-packages/requests/api.py in request(method, url, **kwargs)
     56     # cases, and look like a memory leak in others.
     57     with sessions.Session() as session:
---> 58         return session.request(method=method, url=url, **kwargs)
     59 
     60 

/usr/local/lib/python3.5/dist-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    506         }
    507         send_kwargs.update(settings)
--> 508         resp = self.send(prep, **send_kwargs)
    509 
    510         return resp

/usr/local/lib/python3.5/dist-packages/requests/sessions.py in send(self, request, **kwargs)
    616 
    617         # Send the request
--> 618         r = adapter.send(request, **kwargs)
    619 
    620         # Total elapsed time of the request (approximately)

/usr/local/lib/python3.5/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    438                     decode_content=False,
    439                     retries=self.max_retries,
--> 440                     timeout=timeout
    441                 )
    442 

/usr/local/lib/python3.5/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    599                                                   timeout=timeout_obj,
    600                                                   body=body, headers=headers,
--> 601                                                   chunked=chunked)
    602 
    603             # If we're going to release the connection in ``finally:``, then

/usr/local/lib/python3.5/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
    344         # Trigger any extra validation we need to do.
    345         try:
--> 346             self._validate_conn(conn)
    347         except (SocketTimeout, BaseSSLError) as e:
    348             # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.

/usr/local/lib/python3.5/dist-packages/urllib3/connectionpool.py in _validate_conn(self, conn)
    848         # Force connect early to allow us to validate the connection.
    849         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 850             conn.connect()
    851 
    852         if not conn.is_verified:

/usr/local/lib/python3.5/dist-packages/urllib3/connection.py in connect(self)
    324             ca_cert_dir=self.ca_cert_dir,
    325             server_hostname=hostname,
--> 326             ssl_context=context)
    327 
    328         if self.assert_fingerprint:

/usr/local/lib/python3.5/dist-packages/urllib3/util/ssl_.py in ssl_wrap_socket(sock, keyfile, certfile, cert_reqs, ca_certs, server_hostname, ssl_version, ciphers, ssl_context, ca_cert_dir)
    327         context.load_cert_chain(certfile, keyfile)
    328     if HAS_SNI:  # Platform-specific: OpenSSL with enabled SNI
--> 329         return context.wrap_socket(sock, server_hostname=server_hostname)
    330 
    331     warnings.warn(

/usr/lib/python3.5/ssl.py in wrap_socket(self, sock, server_side, do_handshake_on_connect, suppress_ragged_eofs, server_hostname)
    383                          suppress_ragged_eofs=suppress_ragged_eofs,
    384                          server_hostname=server_hostname,
--> 385                          _context=self)
    386 
    387     def wrap_bio(self, incoming, outgoing, server_side=False,

/usr/lib/python3.5/ssl.py in __init__(self, sock, keyfile, certfile, server_side, cert_reqs, ssl_version, ca_certs, do_handshake_on_connect, family, type, proto, fileno, suppress_ragged_eofs, npn_protocols, ciphers, server_hostname, _context)
    758                         # non-blocking
    759                         raise ValueError("do_handshake_on_connect should not be specified for non-blocking sockets")
--> 760                     self.do_handshake()
    761 
    762             except (OSError, ValueError):

/usr/lib/python3.5/ssl.py in do_handshake(self, block)
    994             if timeout == 0.0 and block:
    995                 self.settimeout(None)
--> 996             self._sslobj.do_handshake()
    997         finally:
    998             self.settimeout(timeout)

/usr/lib/python3.5/ssl.py in do_handshake(self)
    639     def do_handshake(self):
    640         """Start the SSL/TLS handshake."""
--> 641         self._sslobj.do_handshake()
    642         if self.context.check_hostname:
    643             if not self.server_hostname:

KeyboardInterrupt: 
In [ ]:
 
In [ ]:
# coding: utf-8

# memegen
# 
# config file that script reads the meme image to search for. text0 (top), text1 (bottom). 
# 
# Link a meme to a user.
# 
# storage of json files for every meme created. 
# 
# create json file of result. includes user, name, text, url link. 
# 
# use artcgallery config that reads the next upcoming blog title and create meme from it. Also read tags, 
# 
# generate config by sending url. flask restful. 
# 
# add auth key.
# 
# download all meme images and stop making request to imgflip. Make one request and then just access the images locally. 
# 
# 
# 
# create image from text0 and text1.
# 
# search for meme by id or name.
# 
# local data base of memes. able to add more image memes to database. 
# 
# hey gurl. give dimensions and pos that text wrap around
# 
# 

# In[234]:

import requests
import getpass
import shutil
import PIL
import json
from PIL import ImageDraw, ImageFont
import os
import bs4
import configparser
import subprocess
#import tweepy
#import facebook

from PIL import ImageFont
from PIL import Image
from PIL import ImageDraw
from shutil import copyfile
import pickle


# In[152]:

myusr = getpass.getuser()


# In[153]:

#def memedata():
#    reqimg = requests.get('https://api.imgflip.com/get_memes')

#    reqjsn = (reqimg.json())

#    return reqjsn['data']['memes']

#def memeadd():
#    return memedata()
    #return requests.get('https://api.imgflip.com/get_memes')


# In[154]:

#with open('/home/{}/memetest'.format(myusr))


# In[ ]:




# In[155]:

#reqimg = requests.get('https://api.imgflip.com/get_memes')

#reqjsn = (reqimg.json())

#gtmem = (reqjsn['data']['memes'])


# In[ ]:




# In[186]:

#with open('/home/{}/memes.json'.format(myusr), 'w') as mejs:
    #print(mejs)import json
    #with open('data.txt', 'w') as outfile:
#    json.dump(reqjsn, mejs)
    #mejs.write(reqjsn)


# In[238]:

with open('/home/{}/meme.pickle'.format(myusr), 'rb') as handle:
    memelis = pickle.load(handle)


#print (memelis)

#with open('/home/{}/memedata.pickle'.format(myusr), 'rb') as hand:
#    dicinf = pickle.load(hand)
#dicinf = dict({str(thtim.timestamp + usrfolz : dict({'musr' : myusr, 'user' : usrfolz, 'memeid' : gtm['id'], 'memename' : gtm['name'], 'uptext' : upzero, 'bottext' : botzero, 'img' : '/{}/galleries/{}.jpg'.format(usrfolz, gtm['id'])})})
        
# In[227]:

#with open('/home/{}/memes.json'.format(myusr), 'r') as mejsz:
    #print(mejs)
    #print(mejsz.read())
#    merd = mejsz.read()


# In[228]:

#nerduc = json.loads(merd)


# In[229]:

#gtmem = nerduc['data']['memes']


# In[6]:

#tyro = Image.open('/home/{}/memetest/galleries/default/{}.png')


# In[7]:

#tyro.size[0]


# In[8]:

#tydic = dict({'id': 123, 'name' : 'toast', 'width' : tyro.size[0], 'height' : tyro.size[1]})


# In[9]:

#tydic


# In[10]:

#gtmem.append(tydic)


# In[239]:

for gtm in memelis:
    print('# ' +gtm['name'])
    print('# ' +gtm['id'])


# In[13]:

'''
for gtm in gtmem:
    print(gtm)
    #grrjs = json.loads(gtm)
    #print(grrjs)
    gtm.update({'imgpath' : '/galleries/{}.jpg'.format(gtm['id'])})
    print(gtm['url'])
    response = requests.get(gtm['url'], stream=True)
    
    with open('/home/{}/memetest/galleries/default/{}.jpg'.format(myusr, gtm['id']), 'wb') as out_file:
        shutil.copyfileobj(response.raw, out_file)
        del response
        
'''


# In[14]:

#os.listdir('/home/{}/memetest/galleries/'.format(myusr))


# In[ ]:




# In[ ]:




# In[15]:

#specmem = input('what name of meme: ')
#try:
    #shutil.copy('/home/{}/memetest/ /home/{}/memesite/{}/'.format(myusr, myusr, usrfolz))
#    os.mkdir('/home/{}/userconfig/{}.ini'.format(myusr, usrfolz))
#    subprocess.call('rsync -rv /home/{}/git/act.ini /home/{}/userconfig/{}.ini'.format(myusr, myusr, usrfolz), shell=True)

    #/home/{}/git/act.ini'.format(myusr)
#except FileExistsError:
    #subprocess.call('rsync -rv /home/{}/memetest/ /home/{}/memesite/{}'.format(myusr, myusr, usrfolz), shell=True)
#    pass
    #print('no dir created')


# In[245]:

config = configparser.RawConfigParser()
#config.read('/home/{}/userconfig/{}.ini'.format(myusr, usrfolz))
config.read('/home/{}/git/act.ini'.format(myusr))

# getfloat() raises an exception if the value is not a float
# getint() and getboolean() also do this for their respective types
defpath = config.get('default', 'defaultpath')
specmem = config.get('default', 'memename')
textzero = config.get('default', 'toptext')
textone = config.get('default', 'bottomtext')
usrfolz = config.get('default', 'usrfolz')

#toptxfil = config.get('default', 'toptxfil')
#toptxstk = config.get('default', 'toptxstk')
#bottxfil = config.get('default', 'bottxfil')
#bottxstk = config.get('default', 'bottxstk')
newfilid = config.get('default', 'newfilid')
newfilna = config.get('default', 'newfilna')
newfilloc = config.get('default', 'newfilloc')
newfiltf = config.get('default', 'newfiltf')
toptxfil0 = config.get('default', 'toptxfil0')                                                                   
toptxfil1 = config.get('default', 'toptxfil1')
toptxfil2 = config.get('default', 'toptxfil2')

toptxstk0 = config.get('default', 'toptxstk0')                                                             
toptxstk1 = config.get('default', 'toptxstk1')
toptxstk2 = config.get('default', 'toptxstk2')

bottxfil0 = config.get('default', 'bottxfil0')
bottxfil1 = config.get('default', 'bottxfil1')
bottxfil2 = config.get('default', 'bottxfil2') 

bottxstk0 = config.get('default', 'bottxstk0')                                                                
bottxstk1 = config.get('default', 'bottxstk1')
bottxstk2 = config.get('default', 'bottxstk2')


specslug  = specmem.replace(' ', '-')

speclow = specslug.lower()

upzero = textzero.upper()
 
botzero = textone.upper()


# In[249]:

try:
    #shutil.copy('/home/{}/memetest/ /home/{}/memesite/{}/'.format(myusr, myusr, usrfolz))
    os.mkdir('/home/{}/memesite/{}'.format(myusr, usrfolz))
    subprocess.call('rsync -rv /home/{}/memetest/ /home/{}/memesite/{}'.format(myusr, myusr, usrfolz), shell=True)
    subprocess.call('rsync -rv /home/{}/git/act.ini /home/{}/memesite/{}'.format(myusr, myusr, usrfolz), shell=True)

    #/home/{}/git/act.ini'.format(myusr)
except FileExistsError:
    #subprocess.call('rsync -rv /home/{}/memetest/ /home/{}/memesite/{}'.format(myusr, myusr, usrfolz), shell=True)
    pass
    #print('no dir created')


# In[ ]:




# In[ ]:




# In[220]:

#type(gtmem)


# In[230]:

if newfiltf == 'y':
    print('its true')
    copyfile(newfilloc, '/home/{}/memetest/galleries/default/{}.jpg'.format(myusr, newfilid))
    tyro = Image.open('/home/{}/memetest/galleries/default/{}.jpg'.format(myusr, newfilid))
    tydic = dict({'id': newfilid, 'name' : newfilna, 'width' : tyro.size[0], 'height' : tyro.size[1]})
    #nerduc.update(dict({'data': dict({'memes' :tydic})}))
    memelis.append(tydic)
    with open('/home/{}/meme.pickle'.format(myusr), 'wb') as handle:
        pickle.dump(memelis, handle, protocol=pickle.HIGHEST_PROTOCOL)
else:
    print('its false')


# In[233]:

#reqjsn


# In[231]:

#with open('/home/{}/memes.json'.format(myusr), 'w') as mejs:
    #print(mejs)import json
    #with open('data.txt', 'w') as outfile:
    #json.dump(reqjsn, mejs)
    #mejs.write(reqjsn)
#gtmem


# In[224]:

#nerduc


# In[212]:

#type(newfiltf)


# In[203]:

#tyro = Image.open('/home/{}/memetest/galleries/default/{}.jpg'.format(myusr,newfil))


# In[ ]:

#tydic = dict({'id': 123, 'name' : 'toast', 'width' : tyro.size[0], 'height' : tyro.size[1]})


# In[204]:

#tyro.size


# In[110]:

#with open('/home/{}/config.txt'.format(myusr), 'r') as wckz:
#    allkey = wckz.readlines()
#    OAUTH_TOKEN = allkey[0]
    #print(OAUTH_TOKEN)
#    OAUTH_SECRET = allkey[1]
#    CONSUMER_KEY = allkey[2]
#    CONSUMER_SECRET = allkey[3]


# In[240]:

#OAUTH_TOKEN


# In[112]:

#auth = tweepy.OAuthHandler(CONSUMER_KEY.strip('\n'), CONSUMER_SECRET.strip('\n'))
#auth.set_access_token(OAUTH_TOKEN.strip('\n'), OAUTH_SECRET.strip('\n'))


# In[113]:

#api = tweepy.API(auth)


# In[ ]:

#api.update_with_media('{}{}/{}'.format(gifpat, namofgifsea, ranlocgif), status='Started typing script {} {}'.format(blognam, jointag))
        


# In[97]:

#import facebook

#graph = facebook.GraphAPI(access_token='EAACEdEose0cBAFUPIWMm3ti6CFZBYwnsU7pOY3L0aRlmFpxqC9VUpzLbN6hpD3Od3pbqdYFMt6S0ykEDoAQ4hhNK8Vs72ZAQudNncuhgU9IWNYkGFh7MKlH65BhRR1KvWG65zROJKo4mV1AlBVwUhR6aEy6PLeZBirqe76YZA1qGkVfk8olIWcwSv6O7mL667sikFyE8xgZDZD', version="2.1")


# In[158]:

for gtm in memelis:
    #print(gtm)
    grnam = gtm['name']
    
    if specmem in grnam:
        print(grnam)
        print(gtm)
        #print(gtm['url'])
        print(gtm['id'])
        gheigh = (gtm['height'])
        gwth = (gtm['width'])
        #response = requests.get(gtm['url'], stream=True)
        #with open('{}{}-reference.jpg'.format(repathz, str(rdz.author)), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
        
        #with open('/home/{}/memetest/galleries/{}.png'.format(myusr, gtm['id']), 'wb') as out_file:
        #    shutil.copyfileobj(response.raw, out_file)
        #    del response
            
        img = Image.open('/home/{}/memesite/{}/galleries/default/{}.jpg'.format(myusr, usrfolz, gtm['id']))

        imageSize = img.size

        # find biggest font size that works
        fontSize = int(imageSize[1]/5)
        font = ImageFont.truetype("/home/{}/Downloads/impact.ttf".format(myusr), fontSize)
        topTextSize = font.getsize(upzero)
        bottomTextSize = font.getsize(botzero)
        
        #dicinf = dict({str(thtim.timestamp) + usrfolz : dict({'musr' : myusr, 'user' : usrfolz, 'memeid' : gtm['id'], 
        #                                                     'memename' : gtm['name'], 
        #                                                     'uptext' : upzero, 'bottext' : botzero, 
        #                                                     'img' : '/{}/galleries/{}.jpg'.format(usrfolz, gtm['id'])})})
        
        
        #dicinf.update({str(thtim.timestamp) + usrfolz : dict({'musr' : myusr, 'user' : usrfolz, 'memeid' : gtm['id'], 
        #                                                     'memename' : gtm['name'], 
        #                                                     'imgtemp' : '/{}/galleries/default/{}.jpg'.format(usrfolz, gtm['id']),
        #                                                     'uptext' : upzero, 'bottext' : botzero, 
        #                                                     'img' : '/{}/galleries/{}.jpg'.format(usrfolz, gtm['id'])})
        
        
        #dicinf.update({str(thtim.timestamp + usrfolz : dict({'musr' : myusr, 'user' : usrfolz, 'memeid' : gtm['id'], 'memename' : gtm['name'], 'uptext' : upzero, 'bottext' : botzero, 'img' : '/{}/galleries/{}.jpg'.format(usrfolz, gtm['id'])}))
        
        #print(dicinf)
        #with open('/home/{}/memedata.pickle'.format(myusr), 'wb') as hand:
        #        pickle.dump(dicinf, hand, protocol=pickle.HIGHEST_PROTOCOL)

        while topTextSize[0] > imageSize[0]-20 or bottomTextSize[0] > imageSize[0]-20:
            fontSize = fontSize - 1
            font = ImageFont.truetype("/home/{}/Downloads/impact.ttf".format(myusr), fontSize)
            topTextSize = font.getsize(upzero)
            bottomTextSize = font.getsize(botzero)

        # find top centered position for top text
        topTextPositionX = (imageSize[0]/2) - (topTextSize[0]/2)
        topTextPositionY = 0
        topTextPosition = (topTextPositionX, topTextPositionY)

        # find bottom centered position for bottom text
        bottomTextPositionX = (imageSize[0]/2) - (bottomTextSize[0]/2)
        bottomTextPositionY = imageSize[1] - bottomTextSize[1] -10
        bottomTextPosition = (bottomTextPositionX, bottomTextPositionY)

        draw = ImageDraw.Draw(img)

        outlineRange = int(fontSize/15)
        for x in range(-outlineRange, outlineRange+1):
            for y in range(-outlineRange, outlineRange+1):
                    draw.text((topTextPosition[0]+x, topTextPosition[1]+y), upzero, (0,0,0), font=font)
                    draw.text((bottomTextPosition[0]+x, bottomTextPosition[1]+y), botzero, (0,0,0), font=font)

            draw.text(topTextPosition, upzero, (255,255,255), font=font)
            draw.text(bottomTextPosition, botzero, (255,255,255), font=font)

            img.save('/home/{}/memesite/{}/galleries/{}.jpg'.format(myusr, usrfolz, gtm['id']))
            print(gtm['id'])
            filemh = gtm['id']
            with open('/home/{}/memesite/{}/posts/{}.md'.format(myusr, usrfolz, gtm['id']), 'w') as resulmd:
                resulmd.write('<h2>{}</h2>\n\n![{}](/galleries/default/{})\n\n<h2>{}</h2>\n\n'.format(upzero, str(gtm['id']), str(gtm['id']) + '.jpg', botzero))

            with open ('/home/{}/memesite/{}/posts/{}.meta'.format(myusr, usrfolz, gtm['id']), 'w') as opmetat:
                opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(gtm['id'], gtm['id'], timnow.for_json()))

            #graph.put_photo(image=open("/home/{}/memetest/galleries/{}.jpg".format(myusr, gtm['id']), 'rb'),
            #               album_path="me/photos")
            
            
            
           
            
        #print(gtm['url'])
        


# In[119]:

#os.listdir("/home/{}/memetest/galleries/".format(myusr))


# In[128]:

specmo = specmem.replace(' ', '')


# In[147]:

#api.update_with_media("/home/{}/memetest/galleries/{}/{}.jpg".format(myusr, usrfolz, filemh), status= upzero + ', ' + botzero + ' #' + specmo + ' #dankmeme #meme')
        
#with open('/home/{}/memesite/{}/posts/{}.md'.format(myusr, usrfolz, gtm['id']), 'w') as resulmd:
#    resulmd.write('<h2>{}</h2>\n\n![{}](/galleries/default/{})\n\n<h2>{}</h2>\n\n'.format(upzero, str(gtm['id']), str(gtm['id']) + '.jpg', botzero))
            
#with open ('/home/{}/memesite/{}/posts/{}.meta'.format(myusr, usrfolz, gtm['id']), 'w') as opmetat:
#            opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(gtm['id'], gtm['id'], timnow.for_json()))


# In[47]:

#with open('/home/{}/memetest/posts/{}.md'.format(myusr, str(gtm['id'])), 'w') as resulmd:
#    resulmd.write('<h2>{}</h2>\n\n![{}](/galleries/default/{})\n\n<h2>{}</h2>\n\n'.format(upzero, str(gtm['id']), str(gtm['id']) + '.jpg', botzero))
            
#with open ('/home/{}/memetest/posts/{}.meta'.format(myusr, gtm['id']), 'w') as opmetat:
                #opmetat.write("{}".format(str(curtim))
            #for arage in alltags:
            #    print(arage)
#    opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(gtm['id'], gtm['id'], timnow.for_json()))


# In[48]:

os.chdir('/home/{}/memesite/{}'.format(myusr, usrfolz))


# In[49]:

subprocess.call('nikola build', shell=True)


# In[50]:

#subprocess.call('ssh-add /home/pi/.ssh/chain', shell=True)


# In[106]:

#subprocess.call('rsync -rv /home/pi/memetest/ wcmckee@rbnz.tech:/home/wcmckee/meme', shell=True)


# In[43]:

subprocess.call('rsync -rv (/home/{}/memesite/{}/output/ /home/{}/artctrl/{}'.format(myusr, usrfolz,myusr, usrfolz), shell=True)


# In[87]:

#gtm['id']


# In[241]:

#para = {'url' : "http://rbnz.tech/meme/galleries/{}.jpg".format(myusr, gtm['id']), 'caption' : upzero +', ' + botzero, 'access_token' :'EAACEdEose0cBAFUPIWMm3ti6CFZBYwnsU7pOY3L0aRlmFpxqC9VUpzLbN6hpD3Od3pbqdYFMt6S0ykEDoAQ4hhNK8Vs72ZAQudNncuhgU9IWNYkGFh7MKlH65BhRR1KvWG65zROJKo4mV1AlBVwUhR6aEy6PLeZBirqe76YZA1qGkVfk8olIWcwSv6O7mL667sikFyE8xgZDZD'}

#para = {'url' : "http://rbnz.tech/meme/galleries/{}.jpg".format(str(gtm['id'])), 'caption' : upzero +', ' + botzero, 'access_token' :'EAACEdEose0cBAFUPIWMm3ti6CFZBYwnsU7pOY3L0aRlmFpxqC9VUpzLbN6hpD3Od3pbqdYFMt6S0ykEDoAQ4hhNK8Vs72ZAQudNncuhgU9IWNYkGFh7MKlH65BhRR1KvWG65zROJKo4mV1AlBVwUhR6aEy6PLeZBirqe76YZA1qGkVfk8olIWcwSv6O7mL667sikFyE8xgZDZD'}

#para

#repu = requests.post('https://graph.facebook.com/v2.10/me/photos', data=para)

#print(repu.content)


# In[182]:

#import os.path, time
#files = ['/home/pi/git/memegen.ini']

#changes =  {"/home/pi/git/memegen.ini":os.path.getmtime("/home/pi/git/memegen.ini")}

#while True:
#    for f in files:
#        if changes.get(f) < os.path.getmtime(f):
#            print ("File {} has been modified".format(f))
#            changes[f] = os.path.getmtime(f)
#            subprocess.call('python3 /home/{}/gen.py'.format(myusr), shell=True)
#        else:
#            print ("No changes, going to sleep.")
#    time.sleep(10)


# In[148]:



#graph = facebook.GraphAPI(access_token='EAAchIRKSmikBAAeEIPsJXZA7lGMuNQoqUTwU5wvv47z7emNZBOrXW8qzJ4AwyAuoJtZCFNblSPB0thH3iZCAjmeKx6lm8eF6J9caLy1f8jddBDGff00M3IfQ5W5HrnFA6SfMSSThCdGxR6ZAMHIZCeHo6XnNIHtcVTvuHsshADeBVzVupYZAVe7', version="2.1")


# In[149]:

#para = {'url' : 'http://rbnz.tech/meme/galleries/{}/{}.jpg'.format(usrfolz, filemh), 'caption' : upzero + ', ' + botzero, 'access_token' :'EAAchIRKSmikBAAeEIPsJXZA7lGMuNQoqUTwU5wvv47z7emNZBOrXW8qzJ4AwyAuoJtZCFNblSPB0thH3iZCAjmeKx6lm8eF6J9caLy1f8jddBDGff00M3IfQ5W5HrnFA6SfMSSThCdGxR6ZAMHIZCeHo6XnNIHtcVTvuHsshADeBVzVupYZAVe7'}

#para

#repu = requests.post('https://graph.facebook.com/v2.10/me/photos', data=para)

#print(repu.content)


# In[ ]:

#graph.put_photo(image=open())


# In[95]:

#help(facebook.GraphAPI.put_photo)


# In[ ]:
In [ ]:
 

memegenerator

memegen

create memes from name of meme, top text, bottom text. Reads this data from config file. need to add url args that opens and writes args to config file.

memegen

config file that script reads the meme image to search for. text0 (top), text1 (bottom).

Link a meme to a user.

storage of json files for every meme created.

create json file of result. includes user, name, text, url link.

use artcgallery config that reads the next upcoming blog title and create meme from it. Also read tags,

generate config by sending url. flask restful.

add auth key.

download all meme images and stop making request to imgflip. Make one request and then just access the images ocally.

create image from text0 and text1.

search for meme by id or name

In [1]:
import requests
import getpass
import shutil
import PIL
import json
from PIL import ImageDraw, ImageFont
import os
import arrow
import configparser
In [ ]:
 
In [2]:
myusr = getpass.getuser()


reqimg = requests.get('https://api.imgflip.com/get_memes')

reqjsn = (reqimg.json())

gtmem = (reqjsn['data']['memes'])


# In[ ]:




# In[4]:

#for gtm in gtmem:
#    print(gtm)
#    #grrjs = json.loads(gtm)
#    #print(grrjs)
#    gtm.update({'imgpath' : '/galleries/{}.jpg'.format(gtm['id'])})
#    print(gtm['url'])
#    response = requests.get(gtm['url'], stream=True)
#    
#    with open('/home/{}/memetest/galleries/default/{}.jpg'.format(myusr, gtm['id']), 'wb') as out_file:
#        shutil.copyfileobj(response.raw, out_file)
#        del response


# In[25]:

os.listdir('/home/{}/memetest/galleries/'.format(myusr))


# In[ ]:




# In[ ]:




# In[6]:

#specmem = input('what name of meme: ')


# In[47]:

config = configparser.RawConfigParser()
config.read('/home/{}/.config/memegen.ini'.format(myusr))

# getfloat() raises an exception if the value is not a float
# getint() and getboolean() also do this for their respective types
defpath = (config.get('default', 'defaultpath'))
specmem = (config.get('default', 'memename'))
textzero = config.get('default', 'toptext')
textone = config.get('default', 'bottomtext')



#textzero = input('top text: ')
#textone = input('bottom text: ')

upzero = textzero.upper()

botzero = textone.upper()


lentop = len(textzero)
lenbotm = len(textone)



if lenbotm < 5:
    fontbot = 72
elif lenbotm > 5:
    fontbot = 42

if lentop < 5:
    fontsize = 72
elif lentop > 5:
    fontsize = 42


'''
from flask import request
import flask
from flask import Flask
app = Flask(__name__)
@app.route('/')
def api_hello():
    if 'memename' in request.args:
        return request.args['memename']
        for gtm in gtmem:
            #print(gtm)
            grnam = request.args['memename']
            if specmem in grnam:
                return(gtm['url'])
                #print(gtm['id'])
    if 'textzero' in request.args:
        return request.args['textzero']
    if 'textone' in request.args:
        return request.args['textone']
    else:
        return 'Hello John Doe'
'''


'''
parser = reqparse.RequestParser()
parser.add_argument('rate', type=int, help='Rate cannot be converted')
parser.add_argument('name')
args = parser.parse_args()
'''

#from flask import Flask, request

#app = Flask(__name__)


#@app.route('/api/foo/', methods=['GET'])
#def foo():
#    bar = request.args.to_dict()
#    print bar
#    return 'success', 200

#if __name__ == '__main__':   
#    app.run(debug=True)


idict = dict()


timnow = arrow.now()


timnow.for_json()


for gtm in gtmem:
    #print(gtm)
    grnam = gtm['name']
    
    if specmem in grnam:
        print(grnam)
        print(gtm)
        print(gtm['url'])
        print(gtm['id'])
        gheigh = (gtm['height'])
        gwth = (gtm['width'])
        response = requests.get(gtm['url'], stream=True)
        
        with open('/home/{}/memetest/galleries/{}.jpg'.format(myusr, gtm['id']), 'wb') as out_file:
            shutil.copyfileobj(response.raw, out_file)
            del response
            
        meing = PIL.Image.open(('/home/{}/memetest/galleries/{}.jpg'.format(myusr, gtm['id'])))
        #meing
        medraw = ImageDraw.Draw(meing)
        #font = ImageFont.truetype("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 32)
        #font = ImageFont.truetype("/home/wcm/Downloads/fashi954.ttf", 12)
        fontwo = ImageFont.truetype('/usr/share/fonts/truetype/roboto/Roboto-Bold.ttf', fontsize)
        botfont = ImageFont.truetype('/usr/share/fonts/truetype/roboto/Roboto-Bold.ttf', fontbot)

        medraw.text(((gwth - gheigh / 1.5), 10), upzero, (255,255,255), font=fontwo)
        medraw.text(((gwth - gheigh / 1.25), gheigh - 80), botzero, (255,255,255), font=botfont)
        idict.update({gtm['id'] : dict({'id' : gtm['id']})})

        meing.save('/home/{}/memetest/galleries/{}.jpg'.format(myusr, gtm['id']))
        
        #with open('/home/{}/memetest/posts/{}.md'.format(myusr, gtm['id']), 'w') as resulmd:
        #    resulmd.write(str(gtm['id']) + ' \n' + str(elebody.text))
            
        #with open ('/home/{}/memetest/posts/{}.meta'.format(myusr, gtm['id']), 'w') as opmetat:
                #opmetat.write("{}".format(str(curtim))
            #for arage in alltags:
            #    print(arage)
        #    opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(gtm['id'], gtm['id'], timnow.for_json()))

            
        #template = Template('Hello {{ name }}!')
        #template.render(name='William')

        #print(gtm['url'])


# In[67]:

with open('/home/{}/memetest/posts/{}.md'.format(myusr, str(gtm['id'])), 'w') as resulmd:
            resulmd.write('{}\n\n![{}](/galleries/{})\n\n{}\n'.format(upzero, str(gtm['id']), str(gtm['id']) + '.jpg', botzero))
            
with open ('/home/{}/memetest/posts/{}.meta'.format(myusr, gtm['id']), 'w') as opmetat:
                #opmetat.write("{}".format(str(curtim))
            #for arage in alltags:
            #    print(arage)
    opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(gtm['id'], gtm['id'], timnow.for_json()))


# In[ ]:
3
16
Grandma Finds The Internet
{'id': '61556', 'url': 'https://i.imgflip.com/1bhw.jpg', 'height': 480, 'name': 'Grandma Finds The Internet', 'width': 640}
https://i.imgflip.com/1bhw.jpg
61556
In [ ]:
 

results17

results17

In [133]:
import requests
import bs4
import getpass

import arrow
In [105]:
myusr = getpass.getuser()
In [2]:
resulreq = requests.get('http://electionresults.govt.nz/')
In [7]:
bssou = bs4.BeautifulSoup(resulreq.text)
/usr/local/lib/python3.4/dist-packages/bs4/__init__.py:181: UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.

The code that caused this warning is on line 170 of the file /usr/lib/python3.4/runpy.py. To get rid of this warning, change code that looks like this:

 BeautifulSoup(YOUR_MARKUP})

to this:

 BeautifulSoup(YOUR_MARKUP, "lxml")

  markup_type=markup_type))
In [36]:
testlis = list()
In [134]:
for sso in (bssou.findAll('select')):
    for sf in (sso.findAll('option')):
#        print(sf)
        testlis.append(sf)
    #print(sso.text)
In [76]:
letes = len(testlis) - 1

want api of each electorate.

name: Auckland Central id: 1 candinates: name: votes, party

In [125]:
 
In [126]:
timnow = arrow.now()
In [132]:
for somerol in range(10,letes):
    reqid = requests.get('http://electionresults.govt.nz/electorate-details-{}.html'.format(somerol))

    reqtxt = bs4.BeautifulSoup(reqid.text, 'lxml')
    
    elebody = (reqtxt.find('tbody'))
    #print(elebody.text)
    try:
        with open('/home/{}/electionresults/posts/results-{}.md'.format(myusr, somerol), 'w') as resulmd:
            resulmd.write(str(somerol) + ' \n' + str(elebody.text))
            
        with open ('/home/{}/electionresults/posts/{}.meta'.format(myusr, somerol), 'w') as opmetat:
                #opmetat.write("{}".format(str(curtim))
            #for arage in alltags:
            #    print(arage)
            opmetat.write('.. title: {}\n.. slug: {}\n.. date: {}\n.. tags: \n.. link:\n.. description:\n.. type: text'.format(somerol, somerol, timnow.for_json()))

        
    except AttributeError: 
        pass
    #     None not in elebody:
    #    with open('/home/pi/memetest/posts/results-{}.md'.format(somerol), 'w') as resulmd:
    #        resulmd.write(str(elebody.text))
    
    
    
In [ ]:
 
In [70]:
#reqid = requests.get('http://electionresults.govt.nz/electorate-details-{}.html'.format(inid))

#reqtxt = bs4.BeautifulSoup(reqid.text, 'lxml')
In [71]:
#elebody = (reqtxt.findAll('tbody'))
In [72]:
#someresul = list()
In [135]:
#for ele in elebody:
    #print(ele.text)
#    someresul.append(ele.text)
In [15]:
#tbodz = bssou.findAll('td', {'class': 'text-left bold'})
In [108]:
bsu = bssou.findAll('td')
In [122]:
with open('/home/{}/electionresults/stories/index.md'.format(myusr), 'w') as stopn:
    for bs in bsu:
        #print(bs.text)
        stopn.write(bs.text)
In [ ]:
#with open('/home/pi/memetest/posts/results-{}.html'.format(inid, 'w') as resulmd:
#    resulmd.write(elebody)
    
    

coinforcast

coinforcast

In [1]:
import requests
In [2]:
allcoin = requests.get('https://coinbin.org/coins')
In [3]:
allc = allcoin.json()
In [4]:
allcoina = (allc['coins'])
In [ ]:
dcoi =  (allcoina.keys())
In [ ]:
for dc in dcoi:
    #print(dc)
    ethfor = requests.get('https://coinbin.org/{}/forecast'.format(dc))
    ethjs = ethfor.json()
    efore = (ethjs['forecast']) 
    lefor = len(efore)
    pricelis = list()
    for etj in range(0, lefor):
        #print(efore[etj])
        pricelis.append(efore[etj]['usd'])
In [ ]:
ethfor = requests.get('https://coinbin.org/btc/forecast')
In [ ]:
ethjs = ethfor.json()
In [ ]:
efore = (ethjs['forecast']) 
In [ ]:
lefor = len(efore)
In [ ]:
pricelis = list()

sort dict base on value of certain key

In [ ]:
for etj in range(0, lefor):
    print(efore[etj])
    pricelis.append(efore[etj]['usd'])
In [ ]:
sorted(pricelis)
In [ ]:
 
In [ ]:
 

gudrds

goodreasds

In [1]:
import requests
import os
import getpass
import xmltodict
import arrow
import bs4
#import couchdb
#import nltk
In [2]:
myusr = getpass.getuser()
In [3]:
myusr
Out[3]:
'pi'
In [4]:
para = {'key' : 'j6a7NN6aLyIGFrt9YHwibw', 'v': 2}
In [5]:
getreview = requests.get('https://www.goodreads.com/review/list/5753105.xml', params = para)
In [10]:
getreview.text
Out[10]:
'Invalid API key.\n<!-- This is a random-length HTML comment: xxswvkdezpcdzzdzsvosufdsvlkkfmgctesijfcayfoxypoiojdvwpalsypiuvliqfdbwfbsyapeltzvhxoenpepmlgytfytxedeorqhnjckqutyyoyjdycmwkutpvkmtxpxhrfwkuzpslknbztltbvuoylpsjjzbgoytbyqqqrqhuntffrkjwmgozzrkkkivddrtdggnlyprsxzvvuyqybujxbfewrjntnumnnaqjxhpqfheugfywoplupbykijxiwhqzgbszdkpjxzxzzwsmaiezzplkxiguuwvlehbkazesetdiogvecernlxnvsmjyjfwcfkvjadqsxjjbcpsldebjqfdrpujlhxgqtxxfprfstmmnrdbhvqhebxfiqloolusqlflmptabfuawdhloseotvbbzyvrpxxbsixkpyrqitudhokewbllgsqnocrvvtgasemjglvoqeudpwepnmaisprkvsjtkgejkfxinhulykuanxvlhhpgimibdcbfvksrrdvjhzeebphunqfmgvxqbtegstsawlqrkzxrcwwsimlkvceshjueqqvqyurdmvodvxtjaezkjkltwkujopdwjzhjsgiqvsoudwxxj -->'
In [6]:
revxml = xmltodict.parse(getreview.text)
---------------------------------------------------------------------------
ExpatError                                Traceback (most recent call last)
<ipython-input-6-a71d55b83963> in <module>()
----> 1 revxml = xmltodict.parse(getreview.text)

~/py3/lib/python3.5/site-packages/xmltodict.py in parse(xml_input, encoding, expat, process_namespaces, namespace_separator, disable_entities, **kwargs)
    328         parser.ParseFile(xml_input)
    329     else:
--> 330         parser.Parse(xml_input, True)
    331     return handler.item
    332 

ExpatError: syntax error: line 1, column 0
In [7]:
print(revxml)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-7-ee07ee2e877d> in <module>()
----> 1 print(revxml)

NameError: name 'revxml' is not defined
In [67]:
bookdica = dict()
In [68]:
for rev in range(20):
    bookdic = revxml['GoodreadsResponse']['reviews']['review'][rev]['book']
    
    #print(bookdic)
    print(bookdic['isbn'])
    print(bookdic['title'])
    bktit = bookdic['title_without_series']
    bknot = bktit.replace(' ', '-')
    bklow = bknot.lower()
    
    bookimg = bookdic['image_url']
    
    
    print(bklow)
    print(bookimg)
    print(bookdic['authors']['author']['name'])
    authimg = (bookdic['authors']['author']['image_url']['#text'])
    
    bkdes = (revxml['GoodreadsResponse']['reviews']['review'][rev]['book']['description'])
    soupdes = bs4.BeautifulSoup(bkdes)
    print(soupdes.text)
    #print(bookdic)
    #bookdic.update({bookdic['isbn'] : dict({'isbn' : bookdic['isbn'], 'slug' : bklow, 'autname' : bookdic['authors']['author']['name']})})#'autimg' : authimg, 'bookimg' : bookimg})})
    bookdica.update({str(bookdic['isbn']) : dict({'isbn' : bookdic['isbn'], 'num_pages' : bookdic['num_pages']})})
    pubday = (bookdic['publication_day'])
    pubmonth = (bookdic['publication_month'])
    pubyear = (bookdic['publication_year'])
    #pubyr = int(pubyear) + ',' + int(pubday) + ',' + int(pubmonth)
    try:
        ardate = arrow.get(int(pubyear), int(pubmonth), int(pubday))
        #print(pubday + ',' + pubmonth + ',' + pubyear)
        print(ardate)
        print(ardate.humanize())
        bookdic.update({bookdic : dict({'timestamp' : ardate, 'timehuman' : ardate.humanize()})})
    except TypeError:
        continue
    #print(bookdic['authors']['author'])
    #print(bookdic['authors']['author']['name'])
    
    

    print(bookdic['num_pages'])
OrderedDict([('@nil', 'true')])
All the Breaking Waves
all-the-breaking-waves
https://images.gr-assets.com/books/1470750981m/31139576.jpg
Kerry Lonsdale
From the bestselling author of Everything We Keep comes a gripping tale of long-buried secrets, the strength of forgiveness, and the healing power of returning home for good.After a harrowing accident tore her family apart, Molly Brennan fled from the man she loved and the tragic mistake she made.Twelve years later, Molly has created a new life for herself and her eight-year-old daughter, Cassie. The art history professor crafts jewelry as unique and weathered as the surf-tumbled sea glass she collects, while raising her daughter in a safe and loving environment—something Molly never had. But when Cassie is plagued by horrific visions and debilitating nightmares, Molly is forced to return to the one place she swore she’d never move back to—home to Pacific Grove.A riveting exploration of love, secrets, and motherhood, All the Breaking Waves is the poignant story of a woman who discovers she must confront her past, let go of her guilt, and summon everything in her power to save her daughter.
1471400360
Paper Aeroplanes (Paper Aeroplanes, #1)
paper-aeroplanes
https://images.gr-assets.com/books/1366494917m/17315134.jpg
Dawn O'Porter
It's the mid-1990s, and fifteen year-old Guernsey schoolgirls, Renée and Flo, are not really meant to be friends. Thoughtful, introspective and studious Flo couldn't be more different to ambitious, extroverted and sexually curious Renée. But Renée and Flo are united by loneliness and their dysfunctional families, and an intense bond is formed. Although there are obstacles to their friendship (namely Flo's jealous ex-best friend and Renée's growing infatuation with Flo's brother), fifteen is an age where anything can happen, where life stretches out before you, and when every betrayal feels like the end of the world. For Renée and Flo it is the time of their lives.With graphic content and some scenes of a sexual nature, PAPER AEROPLANES is a gritty, poignant, often laugh-out-loud funny and powerful novel. It is an unforgettable snapshot of small-town adolescence and the heart-stopping power of female friendship.
2013-05-02T00:00:00+00:00
4 years ago
0141043768
What Alice Forgot
what-alice-forgot
https://images.gr-assets.com/books/1377159022m/6469165.jpg
Liane Moriarty
Alice Love is twenty-nine, crazy about her husband, and pregnant with her first child.So imagine Alice’s surprise when she comes to on the floor of a gym and is whisked off to the hospital where she discovers the honeymoon is truly over — she’s getting divorced, she has three kids and she’s actually 39 years old. Alice must reconstruct the events of a lost decade, and find out whether it’s possible to reconstruct her life at the same time. She has to figure out why her sister hardly talks to her, and how is it that she’s become one of those super skinny moms with really expensive clothes. Ultimately, Alice must discover whether forgetting is a blessing or a curse, and whether it’s possible to start over.
0670026603
Me Before You (Me Before You, #1)
me-before-you
https://images.gr-assets.com/books/1357108762m/15507958.jpg
Jojo Moyes
Louisa Clark is an ordinary young woman living an exceedingly ordinary life—steady boyfriend, close family—who has never been farther afield than their tiny village. She takes a badly needed job working for ex-Master of the Universe Will Traynor, who is wheelchair-bound after an accident. Will has always lived a huge life—big deals, extreme sports, worldwide travel—and now he’s pretty sure he cannot live the way he is. Will is acerbic, moody, bossy—but Lou refuses to treat him with kid gloves, and soon his happiness means more to her than she expected. When she learns that Will has shocking plans of his own, she sets out to show him that life is still worth living.A love story for this generation, Me Before You brings to life two people who couldn’t have less in common—a heartbreakingly romantic novel that asks, What do you do when making the person you love happy also means breaking your own heart?
2012-12-31T00:00:00+00:00
4 years ago
161984625X
The Color Project
the-color-project
https://images.gr-assets.com/books/1495203898m/22892448.jpg
Sierra Abrams
Bernice Aurora Wescott has one thing she doesn't want anyone to know: her name. That is, until Bee meets Levi, the local golden boy who runs a charity organization called The Color Project.Levi is not at all shy about attempting to guess Bee’s real name; his persistence is one of the many reasons why Bee falls for him. But while Levi is everything she never knew she needed, giving up her name would feel like a stamp on forever. And that terrifies her.When unexpected news of an illness in the family drains Bee's summer of everything bright, she is pushed to the breaking point. Losing herself in The Color Project—a world of weddings, funerals, cancer patients, and hopeful families that the charity funds—is no longer enough. Bee must hold up the weight of her family, but to do that, she needs Levi. She’ll have to give up her name and let him in completely or lose the best thing that’s ever happened to her.For fans of Stephanie Perkins and Morgan Matson, THE COLOR PROJECT is a story about the three great loves of life—family, friendship, and romance—and the bonds that withstand tragedy.
2017-08-17T00:00:00+00:00
22 days ago
0385497148
Waterfront: A Walk Around Manhattan
waterfront:-a-walk-around-manhattan
https://s.gr-assets.com/assets/nophoto/book/111x148-bcc042a9c91a29c1d680899eff700a03.png
Phillip Lopate
East Side, West Side, from the Little Red Lighthouse to Battery Park City, the wonders of Manhattan's waterfront are both celebrated and secret -- hidden in plain sight. In his brilliant exploration of this defining yet neglected shoreline, personal essayist Philip Lopate also recovers a part of the city's soul. A native New Yorker, Lopate has embraced Manhattan by walking every inch of its perimeter, telling stories on the way of pirates (Captain Kidd) and power brokers (Robert Moses), the lowly shipworm and Typhoid Mary, public housing in Harlem and the building of the Brooklyn Bridge. He evokes the magic of the once bustling old port from Melville's and Whitman's day to the era of the longshoremen in On the Waterfront, while appraising today's developers and environmental activists, and probing new plans for parks and pleasure domes with river views. Whether escorting us into unfamiliar, hazardous crannies or along a Beaux Arts esplanade, Waterfront is a grand literary ramble and defense of urban life by one of our most perceptive observers.
2005-05-10T00:00:00+00:00
12 years ago
0440211824
The Jade Pagoda
the-jade-pagoda
https://images.gr-assets.com/books/1364502669m/3316191.jpg
Marion Clarke
"His mansion was a maze of mysteries, his past a puzzle too dangerous to probe..."As Lesley Blair decorates the bridal suite of handsome widower Drake Wynfield in anticipation of his pending nuptials, she discovers that something is terribly wrong in his lavish home.
1992-11-01T00:00:00+00:00
24 years ago
0312330871
And Then There Were None
and-then-there-were-none
https://images.gr-assets.com/books/1391120695m/16299.jpg
Agatha Christie
First, there were ten - a curious assortment of strangers summoned as weekend guests to a private island off the coast of Devon. Their host, an eccentric millionaire unknown to all of them, is nowhere to be found. All that the guests have in common is a wicked past they're unwilling to reveal - and a secret that will seal their fate. For each has been marked for murder. One by one they fall prey. Before the weekend is out, there will be none. And only the dead are above suspicion.
0345418263
The Princess Bride
the-princess-bride
https://images.gr-assets.com/books/1327903636m/21787.jpg
William Goldman
What happens when the most beautiful girl in the world marries the handsomest prince of all time and he turns out to be...well...a lot less than the man of her dreams?As a boy, William Goldman claims, he loved to hear his father read the S. Morgenstern classic, The Princess Bride. But as a grown-up he discovered that the boring parts were left out of good old Dad's recitation, and only the "good parts" reached his ears.Now Goldman does Dad one better. He's reconstructed the "Good Parts Version" to delight wise kids and wide-eyed grownups everywhere.What's it about? Fencing. Fighting. True Love. Strong Hate. Harsh Revenge. A Few Giants. Lots of Bad Men. Lots of Good Men. Five or Six Beautiful Women. Beasties Monstrous and Gentle. Some Swell Escapes and Captures. Death, Lies, Truth, Miracles, and a Little Sex.In short, it's about everything.
2003-07-15T00:00:00+00:00
14 years ago
0671027344
The Perks of Being a Wallflower
the-perks-of-being-a-wallflower
https://images.gr-assets.com/books/1167352178m/22628.jpg
Stephen Chbosky
The critically acclaimed debut novel from Stephen Chbosky, Perks follows observant “wallflower” Charlie as he charts a course through the strange world between adolescence and adulthood. First dates, family drama, and new friends. Sex, drugs, and The Rocky Horror Picture Show. Devastating loss, young love, and life on the fringes. Caught between trying to live his life and trying to run from it, Charlie must learn to navigate those wild and poignant roller-coaster days known as growing up.
1999-02-01T00:00:00+00:00
18 years ago
0061120073
A Tree Grows in Brooklyn
a-tree-grows-in-brooklyn
https://images.gr-assets.com/books/1327883484m/14891.jpg
Betty  Smith
The beloved American classic about a young girl's coming-of-age at the turn of the century, Betty Smith's A Tree Grows in Brooklyn is a poignant and moving tale filled with compassion and cruelty, laughter and heartache, crowded with life and people and incident. The story of young, sensitive, and idealistic Francie Nolan and her bittersweet formative years in the slums of Williamsburg has enchanted and inspired millions of readers for more than sixty years. By turns overwhelming, sublime, heartbreaking, and uplifting, the daily experiences of the unforgettable Nolans are raw with honesty and tenderly threaded with family connectedness -- in a work of literary art that brilliantly captures a unique time and place as well as incredibly rich moments of universal experience.
2006-05-30T00:00:00+00:00
11 years ago
OrderedDict([('@nil', 'true')])
Lolita
lolita
https://images.gr-assets.com/books/1377756377m/7604.jpg
Vladimir Nabokov
Humbert Humbert - scholar, aesthete and romantic - has fallen completely and utterly in love with Lolita Haze, his landlady's gum-snapping, silky skinned twelve-year-old daughter. Reluctantly agreeing to marry Mrs Haze just to be close to Lolita, Humbert suffers greatly in the pursuit of romance; but when Lo herself starts looking for attention elsewhere, he will carry her off on a desperate cross-country misadventure, all in the name of Love. Hilarious, flamboyant, heart-breaking and full of ingenious word play, Lolita is an immaculate, unforgettable masterpiece of obsession, delusion and lust.
0312577222
The Nightingale
the-nightingale
https://images.gr-assets.com/books/1451446316m/21853621.jpg
Kristin Hannah
Despite their differences, sisters Vianne and Isabelle have always been close. Younger, bolder Isabelle lives in Paris while Vianne is content with life in the French countryside with her husband Antoine and their daughter. But when the Second World War strikes, Antoine is sent off to fight and Vianne finds herself isolated so Isabelle is sent by their father to help her. As the war progresses, the sisters' relationship and strength are tested. With life changing in unbelievably horrific ways, Vianne and Isabelle will find themselves facing frightening situations and responding in ways they never thought possible as bravery and resistance take different forms in each of their actions.
2015-02-03T00:00:00+00:00
2 years ago
0385539258
A Little Life
a-little-life
https://images.gr-assets.com/books/1446469353m/22822858.jpg
Hanya Yanagihara
When four classmates from a small Massachusetts college move to New York to make their way, they're broke, adrift, and buoyed only by their friendship and ambition. There is kind, handsome Willem, an aspiring actor; JB, a quick-witted, sometimes cruel Brooklyn-born painter seeking entry to the art world; Malcolm, a frustrated architect at a prominent firm; and withdrawn, brilliant, enigmatic Jude, who serves as their center of gravity. Over the decades, their relationships deepen and darken, tinged by addiction, success, and pride. Yet their greatest challenge, each comes to realize, is Jude himself, by midlife a terrifyingly talented litigator yet an increasingly broken man, his mind and body scarred by an unspeakable childhood, and haunted by what he fears is a degree of trauma that he’ll not only be unable to overcome—but that will define his life forever.
2015-03-10T00:00:00+00:00
2 years ago
0007172893
Fludd
fludd
https://s.gr-assets.com/assets/nophoto/book/111x148-bcc042a9c91a29c1d680899eff700a03.png
Hilary Mantel
One dark and stormy night in 1956, a stranger named Fludd mysteriously turns up in the dismal village of Fetherhoughton. He is the curate sent by the bishop to assist Father Angwin-or is he? In the most unlikely of places, a superstitious town that understands little of romance or sentimentality, where bad blood between neighbors is ancient and impenetrable, miracles begin to bloom. No matter how copiously Father Angwin drinks while he confesses his broken faith, the level of the bottle does not drop. Although Fludd does not appear to be eating, the food on his plate disappears. Fludd becomes lover, gravedigger, and savior, transforming his dull office into a golden regency of decision, unashamed sensation, and unprecedented action. Knitting together the miraculous and the mundane, the dreadful and the ludicrous, Fludd is a tale of alchemy and transformation told with astonishing art, insight, humor, and wit.
0062278819
Prisoner of Night and Fog (Prisoner of Night and Fog, #1)
prisoner-of-night-and-fog
https://images.gr-assets.com/books/1395470671m/17668473.jpg
Anne Blankman
In 1930s Munich, danger lurks behind dark corners, and secrets are buried deep within the city. But Gretchen Müller, who grew up in the National Socialist Party under the wing of her "uncle" Dolf, has been shielded from that side of society ever since her father traded his life for Dolf's, and Gretchen is his favorite, his pet.Uncle Dolf is none other than Adolf Hitler. And Gretchen follows his every command.Until she meets a fearless and handsome young Jewish reporter named Daniel Cohen. Gretchen should despise Daniel, yet she can't stop herself from listening to his story: that her father, the adored Nazi martyr, was actually murdered by an unknown comrade. She also can't help the fierce attraction brewing between them, despite everything she's been taught to believe about Jews.As Gretchen investigates the very people she's always considered friends, she must decide where her loyalties lie. Will she choose the safety of her former life as a Nazi darling, or will she dare to dig up the truth—even if it could get her and Daniel killed?From debut author Anne Blankman comes this harrowing and evocative story about an ordinary girl faced with the extraordinary decision to give up everything she's ever believed . . . and to trust her own heart instead.
2014-04-22T00:00:00+00:00
3 years ago
0316405124
Wolf by Wolf (Wolf by Wolf, #1)
wolf-by-wolf
https://images.gr-assets.com/books/1424193184m/24807186.jpg
Ryan Graudin
Her story begins on a train.The year is 1956, and the Axis powers of the Third Reich and Imperial Japan rule. To commemorate their Great Victory, Hitler and Emperor Hirohito host the Axis Tour: an annual motorcycle race across their conjoined continents. The victor is awarded an audience with the highly reclusive Adolf Hitler at the Victor’s Ball in Tokyo.Yael, a former death camp prisoner, has witnessed too much suffering, and the five wolves tattooed on her arm are a constant reminder of the loved ones she lost. The resistance has given Yael one goal: Win the race and kill Hitler. A survivor of painful human experimentation, Yael has the power to skinshift and must complete her mission by impersonating last year’s only female racer, Adele Wolfe. This deception becomes more difficult when Felix, Adele twin’s brother, and Luka, her former love interest, enter the race and watch Yael’s every move.But as Yael grows closer to the other competitors, can she bring herself to be as ruthless as she needs to be to avoid discovery and complete her mission?From the author of The Walled City comes a fast-paced and innovative novel that will leave you breathless.
2015-10-20T00:00:00+00:00
2 years ago
0399171614
The Wrath and the Dawn (The Wrath and the Dawn, #1)
the-wrath-and-the-dawn
https://images.gr-assets.com/books/1417956963m/18798983.jpg
Renee Ahdieh
One Life to One Dawn.In a land ruled by a murderous boy-king, each dawn brings heartache to a new family. Khalid, the eighteen-year-old Caliph of Khorasan, is a monster. Each night he takes a new bride only to have a silk cord wrapped around her throat come morning. When sixteen-year-old Shahrzad's dearest friend falls victim to Khalid, Shahrzad vows vengeance and volunteers to be his next bride. Shahrzad is determined not only to stay alive, but to end the caliph's reign of terror once and for all.Night after night, Shahrzad beguiles Khalid, weaving stories that enchant, ensuring her survival, though she knows each dawn could be her last. But something she never expected begins to happen: Khalid is nothing like what she'd imagined him to be. This monster is a boy with a tormented heart. Incredibly, Shahrzad finds herself falling in love. How is this possible? It's an unforgivable betrayal. Still, Shahrzad has come to understand all is not as it seems in this palace of marble and stone. She resolves to uncover whatever secrets lurk and, despite her love, be ready to take Khalid's life as retribution for the many lives he's stolen. Can their love survive this world of stories and secrets?
2015-05-12T00:00:00+00:00
2 years ago
OrderedDict([('@nil', 'true')])
Perception of Life (Perception, #1)
perception-of-life
https://images.gr-assets.com/books/1493779760m/35054530.jpg
Shandi Boyes
Noah Taylor is on the cusp of stardom with his band 'Rise Up'. Noah's soul has been shattered beyond repair after a series of devastating family events. The last thing that Noah wants is a committed relationship. But Noah soon learns that life doesn't always work out the way you plan. Sometimes life can take you on a small detour. You will never want another book boyfriend after you meet Noah Taylor. This novel is sexy, gritty and a little bit raw. Please note this novel contains adult content, such as swear words and sexual references.
OrderedDict([('@nil', 'true')])
The Forever Broken
the-forever-broken
https://images.gr-assets.com/books/1500827937m/28373129.jpg
Ker Dukey
A BROKEN NOVELLA. Blaydon has been playing with fire when it comes to his best friend’s twin siblings, Quinn and Sofia. Sneaking behind everyone’s back to be with them both, he finds himself struggling to choose between them. But maybe it’s not his choice to make. When Sofia’s troubles become too much for her to bear, will a desperate act force their truths into the light? Some secrets are used to cover even more painful deceits and they are about to cost them all immeasurably.
2016-01-29T00:00:00+00:00
2 years ago
/usr/local/lib/python3.4/dist-packages/bs4/__init__.py:181: UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.

The code that caused this warning is on line 170 of the file /usr/lib/python3.4/runpy.py. To get rid of this warning, change code that looks like this:

 BeautifulSoup(YOUR_MARKUP})

to this:

 BeautifulSoup(YOUR_MARKUP, "lxml")

  markup_type=markup_type))
In [49]:
paraz = {'api-key' : '177948e7f058537298a9a57b29ac0195:11:74111394', 'list': 'e-book-fiction'}
In [50]:
booklis = requests.get('https://api.nytimes.com/svc/books/v3/lists.json', params = paraz)
---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    303         # Trigger any extra validation we need to do.
--> 304         self._validate_conn(conn)
    305 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _validate_conn(self, conn)
    723         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 724             conn.connect()
    725 

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    202         # Add certificate verification
--> 203         conn = self._new_conn()
    204 

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

gaierror: [Errno -2] Name or service not known

During handling of the above exception, another exception occurred:

ProtocolError                             Traceback (most recent call last)
/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    361                     retries=Retry(self.max_retries, read=False),
--> 362                     timeout=timeout
    363                 )

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    558             retries = retries.increment(method, url, error=e,
--> 559                                         _pool=self, _stacktrace=stacktrace)
    560             retries.sleep()

/usr/lib/python3/dist-packages/urllib3/util/retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    244             if read is False:
--> 245                 raise six.reraise(type(error), error, _stacktrace)
    246             elif read is not None:

/usr/lib/python3/dist-packages/six.py in reraise(tp, value, tb)
    623         if value.__traceback__ is not tb:
--> 624             raise value.with_traceback(tb)
    625         raise value

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    303         # Trigger any extra validation we need to do.
--> 304         self._validate_conn(conn)
    305 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _validate_conn(self, conn)
    723         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 724             conn.connect()
    725 

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    202         # Add certificate verification
--> 203         conn = self._new_conn()
    204 

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

ProtocolError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input-50-a7a60ec7b94e> in <module>()
----> 1 booklis = requests.get('https://api.nytimes.com/svc/books/v3/lists.json', params = paraz)

/usr/lib/python3/dist-packages/requests/api.py in get(url, **kwargs)
     58 
     59     kwargs.setdefault('allow_redirects', True)
---> 60     return request('get', url, **kwargs)
     61 
     62 

/usr/lib/python3/dist-packages/requests/api.py in request(method, url, **kwargs)
     47 
     48     session = sessions.Session()
---> 49     return session.request(method=method, url=url, **kwargs)
     50 
     51 

/usr/lib/python3/dist-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    455         }
    456         send_kwargs.update(settings)
--> 457         resp = self.send(prep, **send_kwargs)
    458 
    459         return resp

/usr/lib/python3/dist-packages/requests/sessions.py in send(self, request, **kwargs)
    567 
    568         # Send the request
--> 569         r = adapter.send(request, **kwargs)
    570 
    571         # Total elapsed time of the request (approximately)

/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    405 
    406         except (ProtocolError, socket.error) as err:
--> 407             raise ConnectionError(err, request=request)
    408 
    409         except MaxRetryError as e:

ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))
In [51]:
bookjs = booklis.json()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-51-a0b3d435ed16> in <module>()
----> 1 bookjs = booklis.json()

AttributeError: 'list' object has no attribute 'json'
In [52]:
bookjs
Out[52]:
{'copyright': 'Copyright (c) 2017 The New York Times Company.  All Rights Reserved.',
 'last_modified': '2017-03-21T13:38:01-04:00',
 'num_results': 15,
 'results': [{'amazon_product_url': 'https://www.amazon.com/Full-Package-Lauren-Blakely-ebook/dp/B01MT5HMRV?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Lauren Blakely',
     'contributor': 'by Lauren Blakely',
     'contributor_note': '',
     'description': "A man shares a cramped apartment with his friend's fetching sister.",
     'price': 0,
     'primary_isbn10': 'None',
     'primary_isbn13': 'A00B01MT5HMRV',
     'publisher': 'Lauren Blakely',
     'title': 'FULL PACKAGE'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 1,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 1},
  {'amazon_product_url': 'http://www.amazon.com/Guernsey-Literary-Potato-Peel-Society/dp/0385340990?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Mary Ann Shaffer and Annie Barrows',
     'contributor': 'by Mary Ann Shaffer and Annie Barrows',
     'contributor_note': '',
     'description': 'After World War II, a journalist travels to the island of Guernsey to meet residents who resisted the Nazi occupation. Originally published in 2008.',
     'price': 0,
     'primary_isbn10': '0440337976',
     'primary_isbn13': '9780440337973',
     'publisher': 'Dial',
     'title': 'THE GUERNSEY LITERARY AND POTATO PEEL PIE SOCIETY'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0385341008', 'isbn13': '9780385341004'},
    {'isbn10': '0440337976', 'isbn13': '9780440337973'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 2,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 2},
  {'amazon_product_url': 'https://www.amazon.com/Whistler-John-Grisham-ebook/dp/B01C1LUFFK?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'John Grisham',
     'contributor': 'by John Grisham',
     'contributor_note': '',
     'description': 'A whistleblower alerts a Florida investigator to judicial corruption involving the Mob and Indian casinos.',
     'price': 0,
     'primary_isbn10': '0385541201',
     'primary_isbn13': '9780385541206',
     'publisher': 'Doubleday',
     'title': 'THE WHISTLER'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0385541198', 'isbn13': '9780385541190'},
    {'isbn10': '0385541201', 'isbn13': '9780385541206'},
    {'isbn10': '0385541570', 'isbn13': '9780385541572'},
    {'isbn10': '1101967676', 'isbn13': '9781101967676'},
    {'isbn10': '1101967684', 'isbn13': '9781101967683'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 3,
   'rank_last_week': 4,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': 'https://www.nytimes.com/2016/11/06/books/review/john-grisham-whistler.html',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 12},
  {'amazon_product_url': 'https://www.amazon.com/Ring-Fire-Pike-Logan-Thriller/dp/1101984767?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Brad Taylor',
     'contributor': 'by Brad Taylor',
     'contributor_note': '',
     'description': 'Pike Logan, a member of a secret counterterrorist unit called the Taskforce, investigates a Saudi-backed Moroccan terrorist cell.',
     'price': 0,
     'primary_isbn10': 'None',
     'primary_isbn13': '9781101984772',
     'publisher': 'Dutton',
     'title': 'RING OF FIRE'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '1101984767', 'isbn13': '9781101984765'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 4,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 1},
  {'amazon_product_url': 'http://www.amazon.com/Small-Great-Things-Jodi-Picoult-ebook/dp/B01AQNYZ3I?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Jodi Picoult',
     'contributor': 'by Jodi Picoult',
     'contributor_note': '',
     'description': 'A medical crisis entangles a black nurse, a white supremacist father and a white lawyer.',
     'price': 0,
     'primary_isbn10': '034554496X',
     'primary_isbn13': '9780345544964',
     'publisher': 'Ballantine',
     'title': 'SMALL GREAT THINGS'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0345544951', 'isbn13': '9780345544957'},
    {'isbn10': '034554496X', 'isbn13': '9780345544964'},
    {'isbn10': '1410463745', 'isbn13': '9781410463746'},
    {'isbn10': '0425286010', 'isbn13': '9780425286012'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 5,
   'rank_last_week': 7,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': 'https://www.nytimes.com/2016/10/16/books/review/jodi-picoult-small-great-things-roxane-gay.html',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 9},
  {'amazon_product_url': 'https://www.amazon.com/Shelter-Adeline-Badge-Honor-Heroes-ebook/dp/B01MF62CN8?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Susan Stoker',
     'contributor': 'by Susan Stoker',
     'contributor_note': '',
     'description': 'A fireman must keep his overprotective nature in check while pursuing an epileptic prey to her suspiciously obsessive boss.',
     'price': 0,
     'primary_isbn10': 'None',
     'primary_isbn13': 'A00B01MF62CN8',
     'publisher': 'Stoker Aces Production',
     'title': 'SHELTER FOR ADELINE'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 6,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 1},
  {'amazon_product_url': 'http://www.amazon.com/Man-Called-Ove-Novel/dp/1476738025?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Fredrik Backman',
     'contributor': 'by Fredrik Backman',
     'contributor_note': '',
     'description': 'An angry old curmudgeon gets new neighbors, and things are about to change for all of them.',
     'price': 0,
     'primary_isbn10': 'None',
     'primary_isbn13': '9781476738031',
     'publisher': 'Atria',
     'title': 'A MAN CALLED OVE'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '1476738025', 'isbn13': '9781476738024'},
    {'isbn10': '1476738017', 'isbn13': '9781476738017'},
    {'isbn10': '1594139830', 'isbn13': '9781594139833'},
    {'isbn10': '1410472922', 'isbn13': '9781410472922'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 7,
   'rank_last_week': 9,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 3},
  {'amazon_product_url': 'https://www.amazon.com/Mistress-Novel-Danielle-Steel-ebook/dp/B01E2GZ5FC?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Danielle Steel',
     'contributor': 'by Danielle Steel',
     'contributor_note': '',
     'description': 'The beautiful mistress of a Russian oligarch falls in love with an artist and yearns for freedom.',
     'price': 0,
     'primary_isbn10': '0425285359',
     'primary_isbn13': '9780425285350',
     'publisher': 'Delacorte',
     'title': 'THE MISTRESS'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0345531116', 'isbn13': '9780345531117'},
    {'isbn10': '0425285359', 'isbn13': '9780425285350'},
    {'isbn10': '0735210039', 'isbn13': '9780735210035'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 8,
   'rank_last_week': 2,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 2},
  {'amazon_product_url': 'https://www.amazon.com/Guests-South-Battery-Tradd-Street/dp/0451475232?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Karen White',
     'contributor': 'by Karen White',
     'contributor_note': '',
     'description': 'Spirits invade the life of a Charleston realtor.',
     'price': 0,
     'primary_isbn10': '',
     'primary_isbn13': '9780698193000',
     'publisher': 'Berkley',
     'title': 'THE GUESTS ON SOUTH BATTERY'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0451475232', 'isbn13': '9780451475237'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 9,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 1},
  {'amazon_product_url': 'https://www.amazon.com/No-Mans-Land-John-Puller/dp/145558651X?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'David Baldacci',
     'contributor': 'by David Baldacci',
     'contributor_note': '',
     'description': 'John Puller, a special agent with the Army, searches for the truth about his mother, who disappeared 30 years ago.',
     'price': 0,
     'primary_isbn10': '1455586498',
     'primary_isbn13': '9781455586493',
     'publisher': 'Grand Central',
     'title': "NO MAN'S LAND"}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '145558651X', 'isbn13': '9781455586516'},
    {'isbn10': '1455541664', 'isbn13': '9781455541669'},
    {'isbn10': '1455586498', 'isbn13': '9781455586493'},
    {'isbn10': '1455586501', 'isbn13': '9781455586509'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 10,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 8},
  {'amazon_product_url': 'https://www.amazon.com/Below-Belt-Stone-Barrington-Novel/dp/0399573976?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Stuart Woods',
     'contributor': 'by Stuart Woods',
     'contributor_note': '',
     'description': 'The New York lawyer Stone Barrington faces danger when he finds himself in possession of a retired C.I.A. agent’s explosive memoir.',
     'price': 0,
     'primary_isbn10': 'None',
     'primary_isbn13': '9780399574184',
     'publisher': 'Putnam',
     'title': 'BELOW THE BELT'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0399573976', 'isbn13': '9780399573972'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 11,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 0},
  {'amazon_product_url': 'https://www.amazon.com/Cross-Line-James-Patterson-ebook/dp/B01C37XEUU?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'James Patterson',
     'contributor': 'by James Patterson',
     'contributor_note': '',
     'description': 'Detective Alex Cross and his wife, Bree, team up to catch a killer causing chaos in Washington, D.C.',
     'price': 0,
     'primary_isbn10': '031640716X',
     'primary_isbn13': '9780316407168',
     'publisher': 'Little, Brown',
     'title': 'CROSS THE LINE'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0316407097', 'isbn13': '9780316407090'},
    {'isbn10': '031640716X', 'isbn13': '9780316407168'},
    {'isbn10': '0316407151', 'isbn13': '9780316407151'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 12,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 0},
  {'amazon_product_url': 'https://www.amazon.com/Dry-Novel-Jane-Harper-ebook/dp/B01BSN15F6?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Jane Harper',
     'contributor': 'by Jane Harper',
     'contributor_note': '',
     'description': '',
     'price': 0,
     'primary_isbn10': '1250105617',
     'primary_isbn13': '9781250105615',
     'publisher': 'Flatiron',
     'title': 'THE DRY'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '1250105609', 'isbn13': '9781250105608'},
    {'isbn10': '1250105617', 'isbn13': '9781250105615'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 13,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 0},
  {'amazon_product_url': 'https://www.amazon.com/Dogs-Purpose-Novel-Humans/dp/0765330342?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'W Bruce Cameron',
     'contributor': 'by W. Bruce Cameron',
     'contributor_note': '',
     'description': 'A canine narrator undergoes a series of reincarnations.',
     'price': 0,
     'primary_isbn10': '1429960272',
     'primary_isbn13': '9781429960274',
     'publisher': 'Forge',
     'title': "A DOG'S PURPOSE"}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '0765326264', 'isbn13': '9780765326263'},
    {'isbn10': '0765330342', 'isbn13': '9780765330345'},
    {'isbn10': '0765388111', 'isbn13': '9780765388117'},
    {'isbn10': '0765388103', 'isbn13': '9780765388100'},
    {'isbn10': '1429960272', 'isbn13': '9781429960274'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 14,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 0},
  {'amazon_product_url': 'https://www.amazon.com/Sleepwalker-Novel-Chris-Bohjalian-ebook/dp/B01FPGY5TK?tag=NYTBS-20',
   'asterisk': 0,
   'bestsellers_date': '2017-01-14',
   'book_details': [{'age_group': '',
     'author': 'Chris Bohjalian',
     'contributor': 'by Chris Bohjalian',
     'contributor_note': '',
     'description': 'The daughters of a Vermont woman who disappeared from her home in the middle of the night try to understand what happened.',
     'price': 0,
     'primary_isbn10': '0385538928',
     'primary_isbn13': '9780385538923',
     'publisher': 'Doubleday',
     'title': 'THE SLEEPWALKER'}],
   'dagger': 0,
   'display_name': 'E-Book Fiction',
   'isbns': [{'isbn10': '038553891X', 'isbn13': '9780385538916'},
    {'isbn10': '0385538928', 'isbn13': '9780385538923'}],
   'list_name': 'E-Book Fiction',
   'published_date': '2017-01-29',
   'rank': 15,
   'rank_last_week': 0,
   'reviews': [{'article_chapter_link': '',
     'book_review_link': '',
     'first_chapter_link': '',
     'sunday_review_link': ''}],
   'weeks_on_list': 0}],
 'status': 'OK'}
In [53]:
booklis = list()
In [54]:
for bores in bookjs['results']:
    #for bor in bores:
    print(bores['book_details'])
    booklis.append(bores['book_details'])
    #print(bores['book_details'][0]['title'])
    #    print(bor)
    #print(bores['book_details'][bor]['author'])
[{'age_group': '', 'price': 0, 'publisher': 'Lauren Blakely', 'primary_isbn13': 'A00B01MT5HMRV', 'author': 'Lauren Blakely', 'contributor': 'by Lauren Blakely', 'title': 'FULL PACKAGE', 'contributor_note': '', 'description': "A man shares a cramped apartment with his friend's fetching sister.", 'primary_isbn10': 'None'}]
[{'age_group': '', 'price': 0, 'publisher': 'Dial', 'primary_isbn13': '9780440337973', 'author': 'Mary Ann Shaffer and Annie Barrows', 'contributor': 'by Mary Ann Shaffer and Annie Barrows', 'title': 'THE GUERNSEY LITERARY AND POTATO PEEL PIE SOCIETY', 'contributor_note': '', 'description': 'After World War II, a journalist travels to the island of Guernsey to meet residents who resisted the Nazi occupation. Originally published in 2008.', 'primary_isbn10': '0440337976'}]
[{'age_group': '', 'price': 0, 'publisher': 'Doubleday', 'primary_isbn13': '9780385541206', 'author': 'John Grisham', 'contributor': 'by John Grisham', 'title': 'THE WHISTLER', 'contributor_note': '', 'description': 'A whistleblower alerts a Florida investigator to judicial corruption involving the Mob and Indian casinos.', 'primary_isbn10': '0385541201'}]
[{'age_group': '', 'price': 0, 'publisher': 'Dutton', 'primary_isbn13': '9781101984772', 'author': 'Brad Taylor', 'contributor': 'by Brad Taylor', 'title': 'RING OF FIRE', 'contributor_note': '', 'description': 'Pike Logan, a member of a secret counterterrorist unit called the Taskforce, investigates a Saudi-backed Moroccan terrorist cell.', 'primary_isbn10': 'None'}]
[{'age_group': '', 'price': 0, 'publisher': 'Ballantine', 'primary_isbn13': '9780345544964', 'author': 'Jodi Picoult', 'contributor': 'by Jodi Picoult', 'title': 'SMALL GREAT THINGS', 'contributor_note': '', 'description': 'A medical crisis entangles a black nurse, a white supremacist father and a white lawyer.', 'primary_isbn10': '034554496X'}]
[{'age_group': '', 'price': 0, 'publisher': 'Stoker Aces Production', 'primary_isbn13': 'A00B01MF62CN8', 'author': 'Susan Stoker', 'contributor': 'by Susan Stoker', 'title': 'SHELTER FOR ADELINE', 'contributor_note': '', 'description': 'A fireman must keep his overprotective nature in check while pursuing an epileptic prey to her suspiciously obsessive boss.', 'primary_isbn10': 'None'}]
[{'age_group': '', 'price': 0, 'publisher': 'Atria', 'primary_isbn13': '9781476738031', 'author': 'Fredrik Backman', 'contributor': 'by Fredrik Backman', 'title': 'A MAN CALLED OVE', 'contributor_note': '', 'description': 'An angry old curmudgeon gets new neighbors, and things are about to change for all of them.', 'primary_isbn10': 'None'}]
[{'age_group': '', 'price': 0, 'publisher': 'Delacorte', 'primary_isbn13': '9780425285350', 'author': 'Danielle Steel', 'contributor': 'by Danielle Steel', 'title': 'THE MISTRESS', 'contributor_note': '', 'description': 'The beautiful mistress of a Russian oligarch falls in love with an artist and yearns for freedom.', 'primary_isbn10': '0425285359'}]
[{'age_group': '', 'price': 0, 'publisher': 'Berkley', 'primary_isbn13': '9780698193000', 'author': 'Karen White', 'contributor': 'by Karen White', 'title': 'THE GUESTS ON SOUTH BATTERY', 'contributor_note': '', 'description': 'Spirits invade the life of a Charleston realtor.', 'primary_isbn10': ''}]
[{'age_group': '', 'price': 0, 'publisher': 'Grand Central', 'primary_isbn13': '9781455586493', 'author': 'David Baldacci', 'contributor': 'by David Baldacci', 'title': "NO MAN'S LAND", 'contributor_note': '', 'description': 'John Puller, a special agent with the Army, searches for the truth about his mother, who disappeared 30 years ago.', 'primary_isbn10': '1455586498'}]
[{'age_group': '', 'price': 0, 'publisher': 'Putnam', 'primary_isbn13': '9780399574184', 'author': 'Stuart Woods', 'contributor': 'by Stuart Woods', 'title': 'BELOW THE BELT', 'contributor_note': '', 'description': 'The New York lawyer Stone Barrington faces danger when he finds himself in possession of a retired C.I.A. agent’s explosive memoir.', 'primary_isbn10': 'None'}]
[{'age_group': '', 'price': 0, 'publisher': 'Little, Brown', 'primary_isbn13': '9780316407168', 'author': 'James Patterson', 'contributor': 'by James Patterson', 'title': 'CROSS THE LINE', 'contributor_note': '', 'description': 'Detective Alex Cross and his wife, Bree, team up to catch a killer causing chaos in Washington, D.C.', 'primary_isbn10': '031640716X'}]
[{'age_group': '', 'price': 0, 'publisher': 'Flatiron', 'primary_isbn13': '9781250105615', 'author': 'Jane Harper', 'contributor': 'by Jane Harper', 'title': 'THE DRY', 'contributor_note': '', 'description': '', 'primary_isbn10': '1250105617'}]
[{'age_group': '', 'price': 0, 'publisher': 'Forge', 'primary_isbn13': '9781429960274', 'author': 'W Bruce Cameron', 'contributor': 'by W. Bruce Cameron', 'title': "A DOG'S PURPOSE", 'contributor_note': '', 'description': 'A canine narrator undergoes a series of reincarnations.', 'primary_isbn10': '1429960272'}]
[{'age_group': '', 'price': 0, 'publisher': 'Doubleday', 'primary_isbn13': '9780385538923', 'author': 'Chris Bohjalian', 'contributor': 'by Chris Bohjalian', 'title': 'THE SLEEPWALKER', 'contributor_note': '', 'description': 'The daughters of a Vermont woman who disappeared from her home in the middle of the night try to understand what happened.', 'primary_isbn10': '0385538928'}]

Get Goodreads book IDs given ISBNs Get Goodreads book IDs given one or more ISBNs. Response contains IDs without any markup. URL: https://www.goodreads.com/book/isbn_to_id (sample url) HTTP method: GET Parameters:

key: Developer key (required).

isbn: ISBNs of books to look up.


Get Goodreads work IDs g

In [55]:
para = {'key' : 'j6a7NN6aLyIGFrt9YHwibw', 'v': 2}
In [56]:
newsnip = dict()
In [57]:
for bok in booklis:
    #print(bok)
    for bo in bok:
        #print(bo)
        print(bo)
        botit = (bo['title'])
        bonow = botit.replace(' ', '-')
        gifparm = {'q' : botit, 'api_key' : 'dc6zaTOxFJmzC'}
        reqgif = requests.get('http://api.giphy.com/v1/gifs/search', params = gifparm)
        gifjs = (reqgif.json())
        print(gifjs['data'][0]['images']['fixed_width'])#['images'])
        print(botit.capitalize())
        print(bo['description'])
        print(bo['author'])
        print(bo['publisher'])
        newdic = {'api-key' : '177948e7f058537298a9a57b29ac0195:11:74111394', 'q' : bonow, 'sort' : 'newest'}
        newnews = requests.get('https://api.nytimes.com/svc/search/v2/articlesearch.json', params = newdic)
        newsjs = (newnews.json())
        newdocs = newsjs['response']['docs']
        
        for newd in newdocs:
            print(newd)
            print(newd['headline']['print_headline'])
            newsnip.update({botit : dict({'printhead' : newd['headline']['print_headline'], 'snippet' : newd['snippet']})})
            print(newd['snippet'])

        #print(bo['primary_isbn13'])
        #print(bo['primary_isbn10'])
        #paraisb = {'key' : 'j6a7NN6aLyIGFrt9YHwibw', 'isbn': bo['primary_isbn13']}
        #reqgdr = requests.get('https://www.goodreads.com/book/isbn_to_id', params = paraisb)
        #print(reqgdr.text)
        #reqb = {'key' : 'j6a7NN6aLyIGFrt9YHwibw', 'q' : (bo['title'])}
        #reqsbib = requests.get('https://www.goodreads.com/search/index.xml', params = reqb)
        #print(reqsbib.text)
{'age_group': '', 'price': 0, 'publisher': 'Lauren Blakely', 'primary_isbn13': 'A00B01MT5HMRV', 'author': 'Lauren Blakely', 'contributor': 'by Lauren Blakely', 'title': 'FULL PACKAGE', 'contributor_note': '', 'description': "A man shares a cramped apartment with his friend's fetching sister.", 'primary_isbn10': 'None'}
---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    307         # urllib3.request. It also calls makefile (recv) on the socket.
--> 308         conn.request(method, url, **httplib_request_kw)
    309 

/usr/lib/python3.4/http/client.py in request(self, method, url, body, headers)
   1089         """Send a complete request to the server."""
-> 1090         self._send_request(method, url, body, headers)
   1091 

/usr/lib/python3.4/http/client.py in _send_request(self, method, url, body, headers)
   1127             body = body.encode('iso-8859-1')
-> 1128         self.endheaders(body)
   1129 

/usr/lib/python3.4/http/client.py in endheaders(self, message_body)
   1085             raise CannotSendHeader()
-> 1086         self._send_output(message_body)
   1087 

/usr/lib/python3.4/http/client.py in _send_output(self, message_body)
    923             message_body = None
--> 924         self.send(msg)
    925         if message_body is not None:

/usr/lib/python3.4/http/client.py in send(self, data)
    858             if self.auto_open:
--> 859                 self.connect()
    860             else:

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    153     def connect(self):
--> 154         conn = self._new_conn()
    155         self._prepare_conn(conn)

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

gaierror: [Errno -2] Name or service not known

During handling of the above exception, another exception occurred:

ProtocolError                             Traceback (most recent call last)
/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    361                     retries=Retry(self.max_retries, read=False),
--> 362                     timeout=timeout
    363                 )

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    558             retries = retries.increment(method, url, error=e,
--> 559                                         _pool=self, _stacktrace=stacktrace)
    560             retries.sleep()

/usr/lib/python3/dist-packages/urllib3/util/retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    244             if read is False:
--> 245                 raise six.reraise(type(error), error, _stacktrace)
    246             elif read is not None:

/usr/lib/python3/dist-packages/six.py in reraise(tp, value, tb)
    623         if value.__traceback__ is not tb:
--> 624             raise value.with_traceback(tb)
    625         raise value

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    307         # urllib3.request. It also calls makefile (recv) on the socket.
--> 308         conn.request(method, url, **httplib_request_kw)
    309 

/usr/lib/python3.4/http/client.py in request(self, method, url, body, headers)
   1089         """Send a complete request to the server."""
-> 1090         self._send_request(method, url, body, headers)
   1091 

/usr/lib/python3.4/http/client.py in _send_request(self, method, url, body, headers)
   1127             body = body.encode('iso-8859-1')
-> 1128         self.endheaders(body)
   1129 

/usr/lib/python3.4/http/client.py in endheaders(self, message_body)
   1085             raise CannotSendHeader()
-> 1086         self._send_output(message_body)
   1087 

/usr/lib/python3.4/http/client.py in _send_output(self, message_body)
    923             message_body = None
--> 924         self.send(msg)
    925         if message_body is not None:

/usr/lib/python3.4/http/client.py in send(self, data)
    858             if self.auto_open:
--> 859                 self.connect()
    860             else:

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    153     def connect(self):
--> 154         conn = self._new_conn()
    155         self._prepare_conn(conn)

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

ProtocolError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input-57-1b46a5c3b9ff> in <module>()
      7         bonow = botit.replace(' ', '-')
      8         gifparm = {'q' : botit, 'api_key' : 'dc6zaTOxFJmzC'}
----> 9         reqgif = requests.get('http://api.giphy.com/v1/gifs/search', params = gifparm)
     10         gifjs = (reqgif.json())
     11         print(gifjs['data'][0]['images']['fixed_width'])#['images'])

/usr/lib/python3/dist-packages/requests/api.py in get(url, **kwargs)
     58 
     59     kwargs.setdefault('allow_redirects', True)
---> 60     return request('get', url, **kwargs)
     61 
     62 

/usr/lib/python3/dist-packages/requests/api.py in request(method, url, **kwargs)
     47 
     48     session = sessions.Session()
---> 49     return session.request(method=method, url=url, **kwargs)
     50 
     51 

/usr/lib/python3/dist-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    455         }
    456         send_kwargs.update(settings)
--> 457         resp = self.send(prep, **send_kwargs)
    458 
    459         return resp

/usr/lib/python3/dist-packages/requests/sessions.py in send(self, request, **kwargs)
    567 
    568         # Send the request
--> 569         r = adapter.send(request, **kwargs)
    570 
    571         # Total elapsed time of the request (approximately)

/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    405 
    406         except (ProtocolError, socket.error) as err:
--> 407             raise ConnectionError(err, request=request)
    408 
    409         except MaxRetryError as e:

ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))
In [58]:
newdic = {'api-key' : '177948e7f058537298a9a57b29ac0195:11:74111394', 'q' : 'blockchain', 'sort' : 'newest'}
In [59]:
newnews = requests.get('https://api.nytimes.com/svc/search/v2/articlesearch.json', params = newdic)
---------------------------------------------------------------------------
gaierror                                  Traceback (most recent call last)
/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    303         # Trigger any extra validation we need to do.
--> 304         self._validate_conn(conn)
    305 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _validate_conn(self, conn)
    723         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 724             conn.connect()
    725 

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    202         # Add certificate verification
--> 203         conn = self._new_conn()
    204 

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

gaierror: [Errno -2] Name or service not known

During handling of the above exception, another exception occurred:

ProtocolError                             Traceback (most recent call last)
/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    361                     retries=Retry(self.max_retries, read=False),
--> 362                     timeout=timeout
    363                 )

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    558             retries = retries.increment(method, url, error=e,
--> 559                                         _pool=self, _stacktrace=stacktrace)
    560             retries.sleep()

/usr/lib/python3/dist-packages/urllib3/util/retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    244             if read is False:
--> 245                 raise six.reraise(type(error), error, _stacktrace)
    246             elif read is not None:

/usr/lib/python3/dist-packages/six.py in reraise(tp, value, tb)
    623         if value.__traceback__ is not tb:
--> 624             raise value.with_traceback(tb)
    625         raise value

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
    515                                                   timeout=timeout,
--> 516                                                   body=body, headers=headers)
    517 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
    303         # Trigger any extra validation we need to do.
--> 304         self._validate_conn(conn)
    305 

/usr/lib/python3/dist-packages/urllib3/connectionpool.py in _validate_conn(self, conn)
    723         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 724             conn.connect()
    725 

/usr/lib/python3/dist-packages/urllib3/connection.py in connect(self)
    202         # Add certificate verification
--> 203         conn = self._new_conn()
    204 

/usr/lib/python3/dist-packages/urllib3/connection.py in _new_conn(self)
    132             conn = connection.create_connection(
--> 133                 (self.host, self.port), self.timeout, **extra_kw)
    134 

/usr/lib/python3/dist-packages/urllib3/util/connection.py in create_connection(address, timeout, source_address, socket_options)
     63     err = None
---> 64     for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
     65         af, socktype, proto, canonname, sa = res

/usr/lib/python3.4/socket.py in getaddrinfo(host, port, family, type, proto, flags)
    529     addrlist = []
--> 530     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    531         af, socktype, proto, canonname, sa = res

ProtocolError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input-59-f82bdb8f7fed> in <module>()
----> 1 newnews = requests.get('https://api.nytimes.com/svc/search/v2/articlesearch.json', params = newdic)

/usr/lib/python3/dist-packages/requests/api.py in get(url, **kwargs)
     58 
     59     kwargs.setdefault('allow_redirects', True)
---> 60     return request('get', url, **kwargs)
     61 
     62 

/usr/lib/python3/dist-packages/requests/api.py in request(method, url, **kwargs)
     47 
     48     session = sessions.Session()
---> 49     return session.request(method=method, url=url, **kwargs)
     50 
     51 

/usr/lib/python3/dist-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    455         }
    456         send_kwargs.update(settings)
--> 457         resp = self.send(prep, **send_kwargs)
    458 
    459         return resp

/usr/lib/python3/dist-packages/requests/sessions.py in send(self, request, **kwargs)
    567 
    568         # Send the request
--> 569         r = adapter.send(request, **kwargs)
    570 
    571         # Total elapsed time of the request (approximately)

/usr/lib/python3/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    405 
    406         except (ProtocolError, socket.error) as err:
--> 407             raise ConnectionError(err, request=request)
    408 
    409         except MaxRetryError as e:

ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))
In [26]:
newsjs = (newnews.json())
In [27]:
newdocs = newsjs['response']['docs']
In [31]:
for newd in newdocs:
    print(newd)
    phead = (newd['headline']['print_headline'])
    pnone = phead.replace(' ', '-')
    print(newd['snippet'])
    newsnip.update({pnone : dict({'printheadlin' : phead, 'snippit' : newd['snippet']})})
{'score': 3.716621, 'document_type': 'article', 'uri': 'nyt://article/f46bbee3-ddf1-5fa3-bad2-0844fb2ad347', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 250, 'web_url': 'https://www.nytimes.com/reuters/2017/09/06/technology/06reuters-ukraine-blockchain.html', 'snippet': "Ukraine's justice ministry carried out trial auctions using blockchain technology for the first time on Wednesday, part of an effort to improve transparency in government transactions.", 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Ukrainian Ministry Carries Out First Blockchain Transactions', 'main': 'Ukrainian Ministry Carries Out First Blockchain Transactions'}, '_id': '59b03c007c459f246b62214a', 'pub_date': '2017-09-06T18:18:28+0000', 'keywords': []}
Ukraine's justice ministry carried out trial auctions using blockchain technology for the first time on Wednesday, part of an effort to improve transparency in government transactions.
{'score': 3.2532094, 'document_type': 'article', 'uri': 'nyt://article/13658f66-ef4d-507e-9a82-1c533c813856', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 288, 'web_url': 'https://www.nytimes.com/reuters/2017/09/06/business/06reuters-blockchain-ecb.html', 'snippet': "Distributed ledger technology like blockchain is not mature enough to power the world's biggest payment systems, though it has the potential to improve system resilience, the European Central Bank and the Bank of Japan said on Wednesday.", 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Blockchain Immature for Big Central Banks, ECB and BOJ Say', 'main': 'Blockchain Immature for Big Central Banks, ECB and BOJ Say'}, '_id': '59b006977c459f246b62206f', 'pub_date': '2017-09-06T14:30:32+0000', 'keywords': []}
Distributed ledger technology like blockchain is not mature enough to power the world's biggest payment systems, though it has the potential to improve system resilience, the European Central Bank and the Bank of Japan said on Wednesday.
{'score': 0.6811652, 'document_type': 'article', 'uri': 'nyt://article/496d1ecb-7808-55a7-a8fd-de68e0496fce', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 390, 'web_url': 'https://www.nytimes.com/reuters/2017/09/05/business/05reuters-sec-enforcement.html', 'snippet': 'Regulators must do more to help mom-and-pop investors better understand the potential risks posed by cyber crime and new technologies used to commit fraud, U.S. Securities and Exchange Commission Chairman Jay Clayton said on Tuesday.', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'SEC Chief Says Cyber Crime Risks Are Substantial, Systemic', 'main': 'SEC Chief Says Cyber Crime Risks Are Substantial, Systemic'}, '_id': '59af58e47c459f246b621ea0', 'pub_date': '2017-09-06T02:09:35+0000', 'keywords': []}
Regulators must do more to help mom-and-pop investors better understand the potential risks posed by cyber crime and new technologies used to commit fraud, U.S. Securities and Exchange Commission Chairman Jay Clayton said on Tuesday.
{'score': 2.4777472, 'document_type': 'article', 'uri': 'nyt://article/b11f50cc-2a60-5f57-bde8-2f96ff7789ea', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 395, 'web_url': 'https://www.nytimes.com/reuters/2017/09/05/business/05reuters-blockchain-insurance-marine.html', 'snippet': 'Consultancy EY, data security firm Guardtime, Microsoft and ship operator Maersk have joined to build a blockchain-based marine insurance platform that will be the first real-world use of the nascent technology in the shipping industry. ', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'EY Teams Up With Maersk, Microsoft on Blockchain-Based Marine Insurance', 'main': 'EY Teams Up With Maersk, Microsoft on Blockchain-Based Marine Insurance'}, '_id': '59af2d497c459f246b621e55', 'pub_date': '2017-09-05T23:03:31+0000', 'keywords': []}
Consultancy EY, data security firm Guardtime, Microsoft and ship operator Maersk have joined to build a blockchain-based marine insurance platform that will be the first real-world use of the nascent technology in the shipping industry. 
{'score': 0.9414933, 'document_type': 'article', 'uri': 'nyt://article/a6d81fb9-0662-547a-a3ee-444995be4008', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 259, 'web_url': 'https://www.nytimes.com/reuters/2017/09/05/business/05reuters-china-finance-digital-regulation.html', 'snippet': "It is the interest of the long-term development of blockchain technologies for the rapidly growing market for fundraising through the issue of digital currencies to be regulated, an adviser to China's central bank said on Tuesday.", 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Regulation of Digital Coin Offerings Needed for Healthy Market: China Central Bank Adviser', 'main': 'Regulation of Digital Coin Offerings Needed for Healthy Market: China Central Bank Adviser'}, '_id': '59aed56f7c459f246b621d51', 'pub_date': '2017-09-05T16:48:35+0000', 'keywords': []}
It is the interest of the long-term development of blockchain technologies for the rapidly growing market for fundraising through the issue of digital currencies to be regulated, an adviser to China's central bank said on Tuesday.
{'score': 0.96538925, 'document_type': 'article', 'uri': 'nyt://article/73e895cf-4c16-5dd1-9f00-f81b4be0e3b6', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 259, 'web_url': 'https://www.nytimes.com/reuters/2017/09/05/technology/05reuters-china-finance-digital.html', 'snippet': "It is the interest of the long-term development of blockchain technologies for the rapidly growing market for fundraising through the issue of digital currencies to be regulated, an adviser to China's central bank said on Tuesday.", 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Regulation of Digital Coin Offerings Needed for Healthy Market: China Central Bank Adviser', 'main': 'Regulation of Digital Coin Offerings Needed for Healthy Market: China Central Bank Adviser'}, '_id': '59ae48c47c459f246b621bd1', 'pub_date': '2017-09-05T06:48:32+0000', 'keywords': []}
It is the interest of the long-term development of blockchain technologies for the rapidly growing market for fundraising through the issue of digital currencies to be regulated, an adviser to China's central bank said on Tuesday.
{'score': 3.2131865, 'document_type': 'article', 'uri': 'nyt://article/6b030a69-6596-5030-8bc5-93efee1e926d', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 246, 'web_url': 'https://www.nytimes.com/reuters/2017/08/31/business/31reuters-blockchain-banks.html', 'snippet': 'Six new banks have joined a UBS-led effort to create a digital cash system that would allow financial markets to make payments and settle transactions quickly via blockchain technology.', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Six Big Banks Join Blockchain Digital Cash Settlement Project', 'main': 'Six Big Banks Join Blockchain Digital Cash Settlement Project'}, '_id': '59a7c7c37c459f246b620fc0', 'pub_date': '2017-08-31T08:24:27+0000', 'keywords': []}
Six new banks have joined a UBS-led effort to create a digital cash system that would allow financial markets to make payments and settle transactions quickly via blockchain technology.
{'score': 0.95841837, 'document_type': 'article', 'uri': 'nyt://article/20f049a2-ad3b-5bac-819e-fe75f2a9701a', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 392, 'web_url': 'https://www.nytimes.com/reuters/2017/08/30/business/30reuters-kik-blockchain-offering.html', 'snippet': 'Kik Interactive, the Ontario, Canada-based creator of the global chat platform Kik, valued at $1 billion, said on Tuesday it will launch in two weeks the sale of its own crypto currency which is expected to raise $125 million. ', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Canada-Based Chat Platform Kik to Launch $125 Million Token Sale', 'main': 'Canada-Based Chat Platform Kik to Launch $125 Million Token Sale'}, '_id': '59a73eb37c459f246b620ec4', 'pub_date': '2017-08-30T22:39:35+0000', 'keywords': []}
Kik Interactive, the Ontario, Canada-based creator of the global chat platform Kik, valued at $1 billion, said on Tuesday it will launch in two weeks the sale of its own crypto currency which is expected to raise $125 million. 
{'score': 3.3641672, 'document_type': 'article', 'uri': 'nyt://article/27ab544e-9240-5f08-a4ce-a8fa8951fdb7', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 358, 'web_url': 'https://www.nytimes.com/reuters/2017/08/30/technology/30reuters-autos-blockchain.html', 'snippet': 'The technology underpinning the cryptocurrency bitcoin is migrating to the auto industry and vehicle sharing.', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Blockchain Technology Moves Into Car Sharing, Mobility Services', 'main': 'Blockchain Technology Moves Into Car Sharing, Mobility Services'}, '_id': '59a6beb77c459f246b620d2e', 'pub_date': '2017-08-30T13:33:28+0000', 'keywords': []}
The technology underpinning the cryptocurrency bitcoin is migrating to the auto industry and vehicle sharing.
{'score': 0.95841837, 'document_type': 'article', 'uri': 'nyt://article/50c2e233-04a6-5933-b014-7aad3b43ae7f', 'type_of_material': 'News', 'new_desk': 'None', 'multimedia': [], 'word_count': 392, 'web_url': 'https://www.nytimes.com/reuters/2017/08/29/business/29reuters-kik-blockchain-offering.html', 'snippet': 'Kik Interactive, the Ontario, Canada-based creator of the global chat platform Kik, valued at $1 billion, said on Tuesday it will launch in two weeks the sale of its own crypto currency which is expected to raise $125 million. ', 'blog': {}, 'source': 'Reuters', 'byline': {'original': 'By REUTERS'}, 'headline': {'print_headline': 'Canada-Based Chat Platform Kik to Launch $125 Million Token Sale', 'main': 'Canada-Based Chat Platform Kik to Launch $125 Million Token Sale'}, '_id': '59a5d3de7c459f246b620b45', 'pub_date': '2017-08-29T20:51:30+0000', 'keywords': []}
Kik Interactive, the Ontario, Canada-based creator of the global chat platform Kik, valued at $1 billion, said on Tuesday it will launch in two weeks the sale of its own crypto currency which is expected to raise $125 million. 
In [39]:
for news in newsnip.keys():
    print(news)
Canada-Bas
Blockchain-Technology-Moves-Into-Car-Sharing,-Mobility-Services
Six-Big-Banks-Join-Blockchain-Digital-Cash-Settlement-Project
Blockchain
Regulation
Regulation-of-Digital-Coin-Offerings-Needed-for-Healthy-Market:-China-Central-Bank-Adviser
SEC-Chief-
FULL PACKAGE
THE GUERNSEY LITERARY AND POTATO PEEL PIE SOCIETY
EY-Teams-Up-With-Maersk,-Microsoft-on-Blockchain-Based-Marine-Insurance
Ukrainian-
Ukrainian-Ministry-Carries-Out-First-Blockchain-Transactions
EY-Teams-U
Canada-Based-Chat-Platform-Kik-to-Launch-$125-Million-Token-Sale
Six-Big-Ba
SEC-Chief-Says-Cyber-Crime-Risks-Are-Substantial,-Systemic
Blockchain-Immature-for-Big-Central-Banks,-ECB-and-BOJ-Say
In [ ]:
 
In [ ]:
 

redTube

RedTube json Python

In [10]:
import requests
import json
import random

import getpass
#import couchdb
import pickle
import getpass
#!flask/bin/python
#from flask import Flask, jsonify
In [11]:
myusr = getpass.getuser()
In [12]:
print(myusr)
pi
In [2]:
#couch = couchdb.Server()
In [36]:
with open('/home/{}/prn.pickle'.format(myusr), 'rb') as handle:
    prnlis = pickle.load(handle)
In [13]:
#db = couch.create('redtube')    
In [14]:
#db = couch['redtube']

Requests and json are the two main modules used for this. Random can also be handy

In [15]:
payload = {'output' : 'json', 'data' : 'redtube.Videos.searchVideos', 'page' : 1}
In [16]:
getprn = requests.get('http://api.redtube.com/', params = payload)
In [17]:
daprn = getprn.json()
In [18]:
levid = len(daprn['videos'])
In [19]:
porndick = dict()
In [ ]:
 
In [47]:
#for lev in range(0, levid):
#    print(daprn['videos'][lev]['video'])
#    prntit = (daprn['videos'][lev]['video']['title'])
#    prnnow = prntit.replace(' ', '-')
#    prnlow = prnnow.lower()
#    print(prnlow)
#    try:
#        somelis = list()
#        for dapr in daprn['videos'][lev]['video']['tags']:
#            print(dapr['tag_name'])
#            somelis.append(dapr['tag_name'])
#            porndick.update({daprn['videos'][lev]['video']['video_id'] : {'tags' : ", ".join(str(x) for x in somelis)}})
            #db.save(porndick)
            #try:
            #    db = couch.create(prnlow)
            #except PreconditionFailed:
            #    db = couch[prnlow]
            #db.save({daprn['videos'][lev]['video']['video_id'] : {'tags' : ", ".join(str(x) for x in somelis)}})
            
#    except KeyError:
#        continue
In [18]:
#for i in db:
#    print(i)
b4bd99ab29c1300495b1c1e6dd001590
In [45]:
#db.save(porndick)

#for i in db:
#    print(db[i])
In [54]:
#print(pornd['tags'])
HD
In [8]:
#loaPrn = json.loads(getPrn.text)
#print loaUrl

Convert it into readable text that you can work with

In [28]:
lenvid = len(daprn[u'videos'])
In [29]:
lenvid
Out[29]:
20
In [25]:
#aldic = dict()
In [42]:
with open('/home/{}/prn3.pickle'.format(myusr), 'rb') as handles:
    aldic = pickle.load(handles)
In [26]:
import shutil
In [46]:
for napn in range(0, lenvid):
    print(daprn[u'videos'][napn]['video']['url'])
    print(daprn[u'videos'][napn]['video']['title'])
    try:
        letae = len(daprn[u'videos'][napn]['video']['tags'])
        tagna = (daprn[u'videos'][napn]['video']['tags'])
        reqbru = requests.get('http://api.giphy.com/v1/gifs/translate?s={}&api_key=dc6zaTOxFJmzC'.format(tagna))
        brujsn = reqbru.json()
        print(brujsn['data']['images']['fixed_width']['url'])
        gurl = (brujsn['data']['images']['fixed_width']['url'])
        gslug = (brujsn['data']['slug'])
        #fislg = gslug.repl
        
        try:
            somelis = list()
            for dapr in daprn['videos'][lev]['video']['tags']:
                print(dapr['tag_name'])
                somelis.append(dapr['tag_name'])
                porndick.update({daprn['videos'][lev]['video']['video_id'] : {'tags' : ", ".join(str(x) for x in somelis)}})
 

        except KeyError:
            continue
        
        aldic.update({gslug : gurl})
        #print(gurl)
        '''
        with open('/home/pi/redtube/posts/{}.meta'.format(gslug), 'w') as blmet:
            blmet.write('.. title: ' + glug + ' \n' + '.. slug: ' + nameofblogpost + ' \n' + '.. date: ' + str(nowtime) + ' \n' +  '.. tags: ' + tagblog + '\n' + '.. link:\n.. description:\n.. type: text')
     
        response = requests.get(gurl, stream=True)#
        response
        with open('/home/pi/redtube/galleries/{}.gif'.format(gslug), 'wb') as out_file:
            shutil.copyfileobj(response.raw, out_file)
            del response
            
            tan = tagna.replace(' ', '-')
            tanq = tan.lower()
            print(tanq)
            
        '''
    except KeyError:
         continue                                                                                                                                                                                                                                                                                                                  
https://www.redtube.com/2553368
Two Lusty Brunettes Are Taking No Prisoners
https://media2.giphy.com/media/ftWOFe1fwGPK0/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553359
Mexican patrol Suspect was seen on CCTV
https://media1.giphy.com/media/BXorYDutuBZ84/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553353
Hot red head big tits Crazy slut brought in
https://media0.giphy.com/media/lnGoIzwfI3ja/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2411894
18VR Two Rocky Dicks For Alexis Crystal VR Porn
https://media0.giphy.com/media/DhLHlUF8tFJPG/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2194970
Spiked heeled tease and suck
https://media2.giphy.com/media/3PNJqxf3Gfknm/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553364
Fat man handjob xxx We are the Law my
https://media0.giphy.com/media/KnM6RQ9Dl6wNy/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553360
POV Deepthroat Blowjob with my Teen StepSister till Facial
https://media2.giphy.com/media/3PNJqxf3Gfknm/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553358
Wife Eva turns into anal slut
https://media0.giphy.com/media/t2AesNogfAeXK/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553357
Night fucking doggy
https://media1.giphy.com/media/pp3voGwFpfYS4/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553356
Bikini Goddess Bexxy Gives an Awesome Handjob with Huge Cumshot
https://media3.giphy.com/media/B08TIKuS9DK4U/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553355
Baby Girl Swallows Daddy's Cum For Breakfast
https://media2.giphy.com/media/1gwx36stVpSFi/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553354
Teen Gets Her Pussy &amp; Throat Stretched
https://media3.giphy.com/media/8NSNUBsKux8u4/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553352
Thick Pawg wife gets massive Facial
https://media2.giphy.com/media/ftWOFe1fwGPK0/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553351
Fucking my tight ass
https://media1.giphy.com/media/2ClUBE8YazoeQ/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553350
Sperm Diet Episode 1 - First Time Oral Creampie And Swallow
https://media3.giphy.com/media/11txuXecP6oQJa/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553349
College teen outdoor multiple orgasm - Made in Canarias
https://media2.giphy.com/media/ftWOFe1fwGPK0/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553348
Fucking my GF in her new sexy outfit
https://media1.giphy.com/media/OpwqitsZKzRqU/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553347
Bondage Punishment
https://media3.giphy.com/media/VNh3XsWCN5Dfq/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553346
Caught my Sister's Husband Smelling my Panties and Fuck him
https://media1.giphy.com/media/2ClUBE8YazoeQ/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
https://www.redtube.com/2553345
Insta Girl gets Fucked Hard - Amateur Couple LeoLulu
https://media3.giphy.com/media/yR4ttMO5adaw/200w.gif
Amateur
Big Cock
Blonde
Blowjob
Cum Shot
HD
Teen
Verified Amateurs
In [ ]:
with open('/home/{}/prn.pickle'.format(myusr), 'wb') as handle:
    pickle.dump(porndick, handle, protocol=pickle.HIGHEST_PROTOCOL)
In [41]:
with open('/home/{}/prn3.pickle'.format(myusr), 'wb') as handle:
    pickle.dump(aldic, handle, protocol=pickle.HIGHEST_PROTOCOL)
In [44]:
#db.save(aldic)