[ou-summer-of-code-2017.git] / 10-word-search / wordsearch-exploratory-solving.ipynb
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Wordsearch\n",
"Given a text file, consisting of three parts (a grid size, a grid, and a list of words), find:\n",
"* the words present in the grid, \n",
"* the longest word present in the grid, \n",
"* the number of words not present in the grid, \n",
"* the longest word not present that can be formed from the leftover letters\n",
"\n",
"The only words that need be considered are the ones given in the list in the puzzle input.\n",
"\n",
"The puzzle consists of:\n",
"1. A line consisting of _w_`x`_h_, where _w_ and _h_ are integers giving the width and height of the grid.\n",
"2. The grid itself, consisting of _h_ lines each of _w_ letters.\n",
"3. A list of words, one word per line, of arbitrary length. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Example\n",
"\n",
"\n",
"```\n",
"...p.mown.\n",
".sdse..ee.\n",
".e.elad.cr\n",
"pi.dtir.ah\n",
"rzsiwovspu\n",
"oawh.kieab\n",
"brow.c.rda\n",
"ecnotops.r\n",
"d.kc.d...b\n",
".staple...\n",
"```\n",
"\n",
"```\n",
"fhjpamownq\n",
"wsdseuqeev\n",
"ieaeladhcr\n",
"piedtiriah\n",
"rzsiwovspu\n",
"oawhakieab\n",
"browpcfrda\n",
"ecnotopssr\n",
"dikchdnpnb\n",
"bstapleokr\n",
"```\n",
"\n",
"14 words added; 6 directions\n",
"\n",
"Present: apace cowhides crazies dock knows lived mown pears probed rhubarb rioted staple tops wide\n",
"\n",
"Decoys: adapting bombing boor brick cackles carnal casino chaplets chump coaster coccyxes coddle collies creels crumbled cunt curds curled curlier deepen demeanor dicier dowses ensuing faddish fest fickler foaming gambol garoting gliding gristle grunts guts ibex impugns instants kielbasy lanyard loamier lugs market meanly minuend misprint mitts molested moonshot mucking oaks olives orgasmic pastrami perfect proceed puckered quashed refined regards retraces revel ridges ringlet scoff shinier siren solaria sprain sunder sunup tamped tapes thirds throw tiller times trains tranquil transfix typesets uric wariness welts whimsy winced winced\n",
"\n",
"Decoys: fickler, adapting, chump, foaming, molested, carnal, crumbled, guts, minuend, bombing, winced, coccyxes, solaria, shinier, cackles\n",
"\n",
"All words: adapting, apace, bombing, cackles, carnal, chump, coccyxes, cowhides, crazies, crumbled, dock, fickler, foaming, guts, knows, lived, minuend, molested, mown, pears, probed, rhubarb, rioted, shinier, solaria, staple, tops, wide, winced\n",
"\n",
"Directions: [('probed', '`(True, 3, 0, <Direction.down: 4>)`'), ('staple', '`(True, 9, 1, <Direction.right: 2>)`'), ('rioted', '`(True, 6, 7, <Direction.upleft: 5>)`'), ('cowhides', '`(True, 8, 3, <Direction.up: 3>)`'), ('tops', '`(True, 7, 4, <Direction.right: 2>)`'), ('knows', '`(True, 8, 2, <Direction.up: 3>)`'), ('lived', '`(True, 2, 4, <Direction.downright: 8>)`'), ('rhubarb', '`(True, 2, 9, <Direction.down: 4>)`'), ('crazies', '`(True, 7, 1, <Direction.up: 3>)`'), ('dock', '`(True, 8, 5, <Direction.up: 3>)`'), ('apace', '`(True, 5, 8, <Direction.up: 3>)`'), ('mown', '`(True, 0, 5, <Direction.right: 2>)`'), ('pears', '`(True, 0, 3, <Direction.downright: 8>)`'), ('wide', '`(True, 4, 4, <Direction.upright: 6>)`')]"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import string\n",
"import re\n",
"import collections\n",
"import copy\n",
"import os\n",
"\n",
"from enum import Enum\n",
"Direction = Enum('Direction', 'left right up down upleft upright downleft downright')\n",
"\n",
"delta = {Direction.left: (0, -1), Direction.right: (0, 1),\n",
"         Direction.up: (-1, 0), Direction.down: (1, 0),\n",
"         Direction.upleft: (-1, -1), Direction.upright: (-1, 1),\n",
"         Direction.downleft: (1, -1), Direction.downright: (1, 1)}\n",
"\n",
"cat = ''.join\n",
"wcat = ' '.join\n",
"lcat = '\\n'.join"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def empty_grid(w, h):\n",
"    return [['.' for c in range(w)] for r in range(h)]"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def show_grid(grid):\n",
"    return lcat(cat(r) for r in grid)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def indices(grid, r, c, l, d):\n",
"    dr, dc = delta[d]\n",
"    w = len(grid[0])\n",
"    h = len(grid)\n",
"    inds = [(r + i * dr, c + i * dc) for i in range(l)]\n",
"    return [(i, j) for i, j in inds\n",
"            if i >= 0\n",
"            if j >= 0\n",
"            if i < h\n",
"            if j < w]"
]
},
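{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick check of `indices` (an illustrative cell added here, not part of the original run): a six-cell slice running down-right from the top-left corner of a 4×4 grid is clipped to just the four cells that lie inside the grid."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The requested slice extends past the grid edge, so indices()\n",
"# silently truncates it to the in-bounds cells.\n",
"indices(empty_grid(4, 4), 0, 0, 6, Direction.downright)"
]
},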
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def gslice(grid, r, c, l, d):\n",
"    return [grid[i][j] for i, j in indices(grid, r, c, l, d)]"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def set_grid(grid, r, c, d, word):\n",
"    for (i, j), l in zip(indices(grid, r, c, len(word), d), word):\n",
"        grid[i][j] = l\n",
"    return grid"
]
},
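{
"cell_type": "markdown",
"metadata": {},
"source": [
"To see how `set_grid` mutates a grid (another illustrative cell, not part of the original run): writing a word into an empty grid overwrites the dots along the chosen direction, and the same grid object is returned."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Place 'cat' reading rightwards from row 1, column 0, then render it.\n",
"print(show_grid(set_grid(empty_grid(4, 4), 1, 0, Direction.right, 'cat')))"
]
},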
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def present(grid, word):\n",
"    w = len(grid[0])\n",
"    h = len(grid)\n",
"    for r in range(h):\n",
"        for c in range(w):\n",
"            for d in Direction:\n",
"                if cat(gslice(grid, r, c, len(word), d)) == word:\n",
"                    return True, r, c, d\n",
"    return False, 0, 0, list(Direction)[0]"
]
},
{
"cell_type": "code",
"execution_count": 158,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def present_many(grid, words):\n",
"    w = len(grid[0])\n",
"    h = len(grid)\n",
"    wordlens = set(len(w) for w in words)\n",
"    presences = []\n",
"    for r in range(h):\n",
"        for c in range(w):\n",
"            for d in Direction:\n",
"                for wordlen in wordlens:\n",
"                    word = cat(gslice(grid, r, c, wordlen, d))\n",
"                    if word in words:\n",
"                        presences += [(word, r, c, d)]\n",
"    return set(presences)"
]
},
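{
"cell_type": "markdown",
"metadata": {},
"source": [
"`present_many` trades the per-word grid scans of `present` for a single pass: at every cell and direction it extracts one slice per distinct word length and tests membership in the word list. An illustrative check on a tiny grid (not part of the original run):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 'cat' runs right from (0, 0) and 'top' runs down from (0, 2);\n",
"# 'cow' is absent, so only two placements come back.\n",
"present_many([['c', 'a', 't'], ['x', 'y', 'o'], ['z', 'w', 'p']], ['cat', 'top', 'cow'])"
]
},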
{
"cell_type": "code",
"execution_count": 141,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# def present_many(grid, words):\n",
"#     w = len(grid[0])\n",
"#     h = len(grid)\n",
"#     wordlens = set(len(w) for w in words)\n",
"#     presences = set()\n",
"#     for r in range(h):\n",
"#         for c in range(w):\n",
"#             for d in Direction:\n",
"#                 for wordlen in wordlens:\n",
"#                     word = cat(gslice(grid, r, c, wordlen, d))\n",
"#                     if word in words:\n",
"#                         presences.add((word, r, c, d))\n",
"#     return presences"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def read_wordsearch(fn):\n",
"    lines = [l.strip() for l in open(fn).readlines()]\n",
"    w, h = [int(s) for s in lines[0].split('x')]\n",
"    grid = lines[1:h+1]\n",
"    words = lines[h+1:]\n",
"    return w, h, grid, words"
]
},
{
"cell_type": "code",
"execution_count": 49,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"(['pistrohxegniydutslxt',\n",
" 'wmregunarbpiledsyuoo',\n",
" 'hojminbmutartslrlmgo',\n",
" 'isrsdniiekildabolpll',\n",
" 'tstsnyekentypkalases',\n",
" 'ssnetengcrfetedirgdt',\n",
" 'religstasuslatxauner',\n",
" 'elgcpgatsklglzistilo',\n",
" 'tndlimitationilkasan',\n",
" 'aousropedlygiifeniog',\n",
" 'kilrprepszffsyzqsrhs',\n",
" 'itlaadorableorpccese',\n",
" 'noaeewoodedpngmqicnl',\n",
" 'gmrtoitailingchelrok',\n",
" 'jadsngninetsahtooeic',\n",
" 'xeernighestsailarmtu',\n",
" 'aeabsolvednscumdfnon',\n",
" 'gydammingawlcandornk',\n",
" 'hurlerslvkaccxcinosw',\n",
" 'iqnanoitacifitrofqqi'],\n",
" ['absolved',\n",
" 'adorable',\n",
" 'aeon',\n",
" 'alias',\n",
" 'ancestor',\n",
" 'baritone',\n",
" 'bemusing',\n",
" 'blonds',\n",
" 'bran',\n",
" 'calcite',\n",
" 'candor',\n",
" 'conciseness',\n",
" 'consequent',\n",
" 'cuddle',\n",
" 'damming',\n",
" 'dashboards',\n",
" 'despairing',\n",
" 'dint',\n",
" 'dullard',\n",
" 'dynasty',\n",
" 'employer',\n",
" 'exhorts',\n",
" 'feted',\n",
" 'fill',\n",
" 'flattens',\n",
" 'foghorn',\n",
" 'fortification',\n",
" 'freakish',\n",
" 'frolics',\n",
" 'gall',\n",
" 'gees',\n",
" 'genies',\n",
" 'gets',\n",
" 'hastening',\n",
" 'hits',\n",
" 'hopelessness',\n",
" 'hurlers',\n",
" 'impales',\n",
" 'infix',\n",
" 'inflow',\n",
" 'innumerable',\n",
" 'intentional',\n",
" 'jerkin',\n",
" 'justification',\n",
" 'kitty',\n",
" 'knuckles',\n",
" 'leaving',\n",
" 'like',\n",
" 'limitation',\n",
" 'locoweeds',\n",
" 'loot',\n",
" 'lucking',\n",
" 'lumps',\n",
" 'mercerising',\n",
" 'monickers',\n",
" 'motionless',\n",
" 'naturally',\n",
" 'nighest',\n",
" 'notion',\n",
" 'ogled',\n",
" 'originality',\n",
" 'outings',\n",
" 'pendulous',\n",
" 'piled',\n",
" 'pins',\n",
" 'pithier',\n",
" 'prep',\n",
" 'randomness',\n",
" 'rectors',\n",
" 'redrew',\n",
" 'reformulated',\n",
" 'remoteness',\n",
" 'retaking',\n",
" 'rethink',\n",
" 'rope',\n",
" 'rubier',\n",
" 'sailors',\n",
" 'scowls',\n",
" 'scum',\n",
" 'sepals',\n",
" 'sequencers',\n",
" 'serf',\n",
" 'shoaled',\n",
" 'shook',\n",
" 'sonic',\n",
" 'spottiest',\n",
" 'stag',\n",
" 'stood',\n",
" 'stratum',\n",
" 'strong',\n",
" 'studying',\n",
" 'surtaxing',\n",
" 'tailing',\n",
" 'tears',\n",
" 'teazles',\n",
" 'vans',\n",
" 'wardrobes',\n",
" 'wooded',\n",
" 'worsts',\n",
" 'zings'])"
]
},
"execution_count": 49,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"width, height, g, ws = read_wordsearch('wordsearch04.txt')\n",
"g, ws"
]
},
{
"cell_type": "code",
"execution_count": 50,
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"absolved (True, 16, 2, <Direction.right: 2>)\n",
"adorable (True, 11, 4, <Direction.right: 2>)\n",
"aeon (True, 11, 4, <Direction.down: 4>)\n",
"alias (True, 15, 15, <Direction.left: 1>)\n",
"ancestor (False, 0, 0, <Direction.left: 1>)\n",
"baritone (False, 0, 0, <Direction.left: 1>)\n",
"bemusing (False, 0, 0, <Direction.left: 1>)\n",
"blonds (False, 0, 0, <Direction.left: 1>)\n",
"bran (True, 1, 9, <Direction.left: 1>)\n",
"calcite (True, 19, 9, <Direction.upright: 6>)\n",
"candor (True, 17, 12, <Direction.right: 2>)\n",
"conciseness (False, 0, 0, <Direction.left: 1>)\n",
"consequent (False, 0, 0, <Direction.left: 1>)\n",
"cuddle (False, 0, 0, <Direction.left: 1>)\n",
"damming (True, 17, 2, <Direction.right: 2>)\n",
"dashboards (False, 0, 0, <Direction.left: 1>)\n",
"despairing (False, 0, 0, <Direction.left: 1>)\n",
"dint (False, 0, 0, <Direction.left: 1>)\n",
"dullard (True, 8, 2, <Direction.down: 4>)\n",
"dynasty (True, 3, 4, <Direction.downright: 8>)\n",
"employer (False, 0, 0, <Direction.left: 1>)\n",
"exhorts (True, 0, 8, <Direction.left: 1>)\n",
"feted (True, 5, 10, <Direction.right: 2>)\n",
"fill (True, 9, 14, <Direction.upleft: 5>)\n",
"flattens (True, 10, 10, <Direction.upleft: 5>)\n",
"foghorn (True, 10, 11, <Direction.downright: 8>)\n",
"fortification (True, 19, 16, <Direction.left: 1>)\n",
"freakish (False, 0, 0, <Direction.left: 1>)\n",
"frolics (True, 16, 16, <Direction.up: 3>)\n",
"gall (False, 0, 0, <Direction.left: 1>)\n",
"gees (True, 17, 0, <Direction.upright: 6>)\n",
"genies (True, 5, 7, <Direction.upleft: 5>)\n",
"gets (True, 6, 4, <Direction.upleft: 5>)\n",
"hastening (True, 14, 13, <Direction.left: 1>)\n",
"hits (True, 2, 0, <Direction.down: 4>)\n",
"hopelessness (False, 0, 0, <Direction.left: 1>)\n",
"hurlers (True, 18, 0, <Direction.right: 2>)\n",
"impales (False, 0, 0, <Direction.left: 1>)\n",
"infix (False, 0, 0, <Direction.left: 1>)\n",
"inflow (False, 0, 0, <Direction.left: 1>)\n",
"innumerable (False, 0, 0, <Direction.left: 1>)\n",
"intentional (False, 0, 0, <Direction.left: 1>)\n",
"jerkin (False, 0, 0, <Direction.left: 1>)\n",
"justification (False, 0, 0, <Direction.left: 1>)\n",
"kitty (True, 8, 15, <Direction.upleft: 5>)\n",
"knuckles (True, 17, 19, <Direction.up: 3>)\n",
"leaving (False, 0, 0, <Direction.left: 1>)\n",
"like (True, 3, 11, <Direction.left: 1>)\n",
"limitation (True, 8, 3, <Direction.right: 2>)\n",
"locoweeds (False, 0, 0, <Direction.left: 1>)\n",
"loot (True, 3, 19, <Direction.up: 3>)\n",
"lucking (True, 7, 10, <Direction.upleft: 5>)\n",
"lumps (True, 0, 17, <Direction.down: 4>)\n",
"mercerising (True, 15, 17, <Direction.up: 3>)\n",
"monickers (False, 0, 0, <Direction.left: 1>)\n",
"motionless (True, 13, 1, <Direction.up: 3>)\n",
"naturally (True, 9, 16, <Direction.up: 3>)\n",
"nighest (True, 15, 4, <Direction.right: 2>)\n",
"notion (True, 17, 18, <Direction.up: 3>)\n",
"ogled (True, 1, 18, <Direction.down: 4>)\n",
"originality (False, 0, 0, <Direction.left: 1>)\n",
"outings (False, 0, 0, <Direction.left: 1>)\n",
"pendulous (False, 0, 0, <Direction.left: 1>)\n",
"piled (True, 1, 10, <Direction.right: 2>)\n",
"pins (True, 7, 4, <Direction.upleft: 5>)\n",
"pithier (False, 0, 0, <Direction.left: 1>)\n",
"prep (True, 10, 4, <Direction.right: 2>)\n",
"randomness (False, 0, 0, <Direction.left: 1>)\n",
"rectors (False, 0, 0, <Direction.left: 1>)\n",
"redrew (False, 0, 0, <Direction.left: 1>)\n",
"reformulated (False, 0, 0, <Direction.left: 1>)\n",
"remoteness (False, 0, 0, <Direction.left: 1>)\n",
"retaking (True, 6, 0, <Direction.down: 4>)\n",
"rethink (False, 0, 0, <Direction.left: 1>)\n",
"rope (True, 9, 4, <Direction.right: 2>)\n",
"rubier (True, 0, 4, <Direction.downright: 8>)\n",
"sailors (True, 7, 15, <Direction.up: 3>)\n",
"scowls (False, 0, 0, <Direction.left: 1>)\n",
"scum (True, 16, 11, <Direction.right: 2>)\n",
"sepals (True, 6, 10, <Direction.upright: 6>)\n",
"sequencers (False, 0, 0, <Direction.left: 1>)\n",
"serf (False, 0, 0, <Direction.left: 1>)\n",
"shoaled (True, 11, 18, <Direction.up: 3>)\n",
"shook (False, 0, 0, <Direction.left: 1>)\n",
"sonic (True, 18, 18, <Direction.left: 1>)\n",
"spottiest (False, 0, 0, <Direction.left: 1>)\n",
"stag (True, 7, 8, <Direction.left: 1>)\n",
"stood (False, 0, 0, <Direction.left: 1>)\n",
"stratum (True, 2, 13, <Direction.left: 1>)\n",
"strong (True, 4, 19, <Direction.down: 4>)\n",
"studying (True, 0, 16, <Direction.left: 1>)\n",
"surtaxing (False, 0, 0, <Direction.left: 1>)\n",
"tailing (True, 13, 6, <Direction.right: 2>)\n",
"tears (True, 13, 3, <Direction.up: 3>)\n",
"teazles (True, 4, 10, <Direction.downright: 8>)\n",
"vans (True, 18, 8, <Direction.upright: 6>)\n",
"wardrobes (False, 0, 0, <Direction.left: 1>)\n",
"wooded (True, 12, 5, <Direction.right: 2>)\n",
"worsts (True, 1, 0, <Direction.downright: 8>)\n",
"zings (True, 10, 14, <Direction.upleft: 5>)\n"
]
}
],
"source": [
"for w in ws:\n",
"    print(w, present(g, w))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Which words are present?"
]
},
{
"cell_type": "code",
"execution_count": 69,
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1 loop, best of 3: 1.28 s per loop\n"
]
}
],
"source": [
"%%timeit\n",
"[w for w in ws if present(g, w)[0]]"
]
},
{
"cell_type": "code",
"execution_count": 70,
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1 loop, best of 3: 215 ms per loop\n"
]
}
],
"source": [
"%%timeit\n",
"[p[0] for p in present_many(g, ws)]"
]
},
{
"cell_type": "code",
"execution_count": 61,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"['absolved',\n",
" 'adorable',\n",
" 'aeon',\n",
" 'alias',\n",
" 'ancestor',\n",
" 'baritone',\n",
" 'bemusing',\n",
" 'blonds',\n",
" 'bran',\n",
" 'calcite',\n",
" 'candor',\n",
" 'conciseness',\n",
" 'consequent',\n",
" 'cuddle',\n",
" 'damming',\n",
" 'dashboards',\n",
" 'despairing',\n",
" 'dint',\n",
" 'dullard',\n",
" 'dynasty',\n",
" 'employer',\n",
" 'exhorts',\n",
" 'feted',\n",
" 'fill',\n",
" 'flattens',\n",
" 'foghorn',\n",
" 'fortification',\n",
" 'freakish',\n",
" 'frolics',\n",
" 'gall',\n",
" 'gees',\n",
" 'genies',\n",
" 'gets',\n",
" 'hastening',\n",
" 'hits',\n",
" 'hopelessness',\n",
" 'hurlers',\n",
" 'impales',\n",
" 'infix',\n",
" 'inflow',\n",
" 'innumerable',\n",
" 'intentional',\n",
" 'jerkin',\n",
" 'justification',\n",
" 'kitty',\n",
" 'knuckles',\n",
" 'leaving',\n",
" 'like',\n",
" 'limitation',\n",
" 'locoweeds',\n",
" 'loot',\n",
" 'lucking',\n",
" 'lumps',\n",
" 'mercerising',\n",
" 'monickers',\n",
" 'motionless',\n",
" 'naturally',\n",
" 'nighest',\n",
" 'notion',\n",
" 'ogled',\n",
" 'originality',\n",
" 'outings',\n",
" 'pendulous',\n",
" 'piled',\n",
" 'pins',\n",
" 'pithier',\n",
" 'prep',\n",
" 'randomness',\n",
" 'rectors',\n",
" 'redrew',\n",
" 'reformulated',\n",
" 'remoteness',\n",
" 'retaking',\n",
" 'rethink',\n",
" 'rope',\n",
" 'rubier',\n",
" 'sailors',\n",
" 'scowls',\n",
" 'scum',\n",
" 'sepals',\n",
" 'sequencers',\n",
" 'serf',\n",
" 'shoaled',\n",
" 'shook',\n",
" 'sonic',\n",
" 'spottiest',\n",
" 'stag',\n",
" 'stood',\n",
" 'stratum',\n",
" 'strong',\n",
" 'studying',\n",
" 'surtaxing',\n",
" 'tailing',\n",
" 'tears',\n",
" 'teazles',\n",
" 'vans',\n",
" 'wardrobes',\n",
" 'wooded',\n",
" 'worsts',\n",
" 'zings']"
]
},
"execution_count": 61,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"ws"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"What is the longest word that is present?"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'fortification'"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sorted([w for w in ws if present(g, w)[0]], key=len)[-1]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"What is the longest word that is absent?"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'justification'"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sorted([w for w in ws if not present(g, w)[0]], key=len)[-1]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"How many letters are unused?"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"57"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"g0 = empty_grid(width, height)\n",
"for w in ws:\n",
"    p, r, c, d = present(g, w)\n",
"    if p:\n",
"        set_grid(g0, r, c, d, w)\n",
"len([c for c in cat(cat(l) for l in g0) if c == '.'])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"What is the longest word you can make from the leftover letters?"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"Counter({'a': 4,\n",
" 'b': 1,\n",
" 'c': 5,\n",
" 'd': 3,\n",
" 'e': 1,\n",
" 'g': 2,\n",
" 'i': 5,\n",
" 'j': 2,\n",
" 'k': 3,\n",
" 'l': 2,\n",
" 'm': 3,\n",
" 'n': 3,\n",
" 'p': 3,\n",
" 'q': 5,\n",
" 'r': 3,\n",
" 's': 3,\n",
" 'w': 2,\n",
" 'x': 4,\n",
" 'y': 2,\n",
" 'z': 1})"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"unused_letters = [l for l, u in zip((c for c in cat(cat(l) for l in g)), (c for c in cat(cat(l) for l in g0)))\n",
"                  if u == '.']\n",
"unused_letter_count = collections.Counter(unused_letters)\n",
"unused_letter_count"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"['ancestor',\n",
" 'baritone',\n",
" 'bemusing',\n",
" 'blonds',\n",
" 'conciseness',\n",
" 'consequent',\n",
" 'cuddle',\n",
" 'dashboards',\n",
" 'despairing',\n",
" 'dint',\n",
" 'employer',\n",
" 'freakish',\n",
" 'gall',\n",
" 'hopelessness',\n",
" 'impales',\n",
" 'infix',\n",
" 'inflow',\n",
" 'innumerable',\n",
" 'intentional',\n",
" 'jerkin',\n",
" 'justification',\n",
" 'leaving',\n",
" 'locoweeds',\n",
" 'monickers',\n",
" 'originality',\n",
" 'outings',\n",
" 'pendulous',\n",
" 'pithier',\n",
" 'randomness',\n",
" 'rectors',\n",
" 'redrew',\n",
" 'reformulated',\n",
" 'remoteness',\n",
" 'rethink',\n",
" 'scowls',\n",
" 'sequencers',\n",
" 'serf',\n",
" 'shook',\n",
" 'spottiest',\n",
" 'stood',\n",
" 'surtaxing',\n",
" 'wardrobes']"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"unused_words = [w for w in ws if not present(g, w)[0]]\n",
"unused_words"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ancestor Counter({'t': 1, 'r': 1, 'a': 1, 'e': 1, 'o': 1, 's': 1, 'n': 1, 'c': 1})\n",
"baritone Counter({'t': 1, 'i': 1, 'a': 1, 'e': 1, 'n': 1, 'o': 1, 'r': 1, 'b': 1})\n",
"bemusing Counter({'m': 1, 'i': 1, 'n': 1, 'g': 1, 'e': 1, 's': 1, 'u': 1, 'b': 1})\n",
"blonds Counter({'n': 1, 's': 1, 'd': 1, 'l': 1, 'o': 1, 'b': 1})\n",
"conciseness Counter({'s': 3, 'n': 2, 'e': 2, 'c': 2, 'i': 1, 'o': 1})\n",
"consequent Counter({'n': 2, 'e': 2, 't': 1, 'q': 1, 's': 1, 'u': 1, 'o': 1, 'c': 1})\n",
"cuddle Counter({'d': 2, 'l': 1, 'u': 1, 'e': 1, 'c': 1})\n",
"dashboards Counter({'a': 2, 'd': 2, 's': 2, 'r': 1, 'h': 1, 'o': 1, 'b': 1})\n",
"*despairing Counter({'i': 2, 'a': 1, 'g': 1, 'e': 1, 'n': 1, 'r': 1, 'd': 1, 'p': 1, 's': 1})\n",
"dint Counter({'t': 1, 'i': 1, 'd': 1, 'n': 1})\n",
"employer Counter({'e': 2, 'm': 1, 'r': 1, 'p': 1, 'o': 1, 'l': 1, 'y': 1})\n",
"freakish Counter({'i': 1, 'a': 1, 'h': 1, 'e': 1, 's': 1, 'r': 1, 'f': 1, 'k': 1})\n",
"*gall Counter({'l': 2, 'a': 1, 'g': 1})\n",
"hopelessness Counter({'s': 4, 'e': 3, 'n': 1, 'h': 1, 'l': 1, 'o': 1, 'p': 1})\n",
"*impales Counter({'m': 1, 'i': 1, 'a': 1, 'e': 1, 'p': 1, 's': 1, 'l': 1})\n",
"infix Counter({'i': 2, 'x': 1, 'n': 1, 'f': 1})\n",
"inflow Counter({'i': 1, 'n': 1, 'w': 1, 'o': 1, 'f': 1, 'l': 1})\n",
"innumerable Counter({'n': 2, 'e': 2, 'm': 1, 'i': 1, 'l': 1, 'r': 1, 'u': 1, 'a': 1, 'b': 1})\n",
"intentional Counter({'n': 3, 't': 2, 'i': 2, 'e': 1, 'o': 1, 'l': 1, 'a': 1})\n",
"*jerkin Counter({'i': 1, 'n': 1, 'e': 1, 'r': 1, 'j': 1, 'k': 1})\n",
"justification Counter({'i': 3, 't': 2, 'a': 1, 'n': 1, 'j': 1, 's': 1, 'f': 1, 'u': 1, 'o': 1, 'c': 1})\n",
"leaving Counter({'i': 1, 'a': 1, 'v': 1, 'e': 1, 'l': 1, 'n': 1, 'g': 1})\n",
"locoweeds Counter({'e': 2, 'o': 2, 's': 1, 'w': 1, 'l': 1, 'd': 1, 'c': 1})\n",
"monickers Counter({'m': 1, 'i': 1, 'n': 1, 'e': 1, 's': 1, 'r': 1, 'o': 1, 'k': 1, 'c': 1})\n",
"originality Counter({'i': 3, 't': 1, 'n': 1, 'g': 1, 'y': 1, 'o': 1, 'a': 1, 'l': 1, 'r': 1})\n",
"outings Counter({'t': 1, 'i': 1, 'n': 1, 'g': 1, 'o': 1, 'u': 1, 's': 1})\n",
"pendulous Counter({'u': 2, 'n': 1, 'l': 1, 'e': 1, 's': 1, 'p': 1, 'd': 1, 'o': 1})\n",
"pithier Counter({'i': 2, 't': 1, 'h': 1, 'e': 1, 'r': 1, 'p': 1})\n",
"randomness Counter({'n': 2, 's': 2, 'm': 1, 'a': 1, 'e': 1, 'r': 1, 'o': 1, 'd': 1})\n",
"rectors Counter({'r': 2, 't': 1, 'e': 1, 's': 1, 'o': 1, 'c': 1})\n",
"redrew Counter({'r': 2, 'e': 2, 'w': 1, 'd': 1})\n",
"reformulated Counter({'e': 2, 'r': 2, 'm': 1, 't': 1, 'a': 1, 'd': 1, 'l': 1, 'f': 1, 'u': 1, 'o': 1})\n",
"remoteness Counter({'e': 3, 's': 2, 'm': 1, 't': 1, 'n': 1, 'r': 1, 'o': 1})\n",
"rethink Counter({'t': 1, 'i': 1, 'n': 1, 'h': 1, 'e': 1, 'r': 1, 'k': 1})\n",
"scowls Counter({'s': 2, 'w': 1, 'l': 1, 'o': 1, 'c': 1})\n",
"sequencers Counter({'e': 3, 's': 2, 'n': 1, 'q': 1, 'u': 1, 'r': 1, 'c': 1})\n",
"serf Counter({'s': 1, 'f': 1, 'r': 1, 'e': 1})\n",
"shook Counter({'o': 2, 'k': 1, 's': 1, 'h': 1})\n",
"spottiest Counter({'t': 3, 's': 2, 'i': 1, 'e': 1, 'p': 1, 'o': 1})\n",
"stood Counter({'o': 2, 't': 1, 'd': 1, 's': 1})\n",
"surtaxing Counter({'t': 1, 'x': 1, 'a': 1, 'g': 1, 'i': 1, 'n': 1, 's': 1, 'u': 1, 'r': 1})\n",
"wardrobes Counter({'r': 2, 'o': 1, 'a': 1, 'e': 1, 's': 1, 'w': 1, 'd': 1, 'b': 1})\n"
]
}
],
"source": [
"makeable_words = []\n",
"for w in unused_words:\n",
"    unused_word_count = collections.Counter(w)\n",
"    if all(unused_word_count[l] <= unused_letter_count[l] for l in unused_word_count):\n",
"        makeable_words += [w]\n",
"        print('*', end='')\n",
"    print(w, unused_word_count)"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"10"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"max(len(w) for w in makeable_words)"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'despairing'"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sorted(makeable_words, key=len)[-1]"
]
},
{
"cell_type": "code",
"execution_count": 78,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def do_wordsearch_tasks_old(fn, show_anyway=False):\n",
"    width, height, grid, words = read_wordsearch(fn)\n",
"\n",
"    used_words = [w for w in words if present(grid, w)[0]]\n",
"    unused_words = [w for w in words if not present(grid, w)[0]]\n",
"    lwp = sorted(used_words, key=len)[-1]\n",
"    lwa = sorted(unused_words, key=len)[-1]\n",
"\n",
"    lwps = [w for w in used_words if len(w) == len(lwp)]\n",
"    lwas = [w for w in unused_words if len(w) == len(lwa)]\n",
"    g0 = empty_grid(width, height)\n",
"    for w in words:\n",
"        p, r, c, d = present(grid, w)\n",
"        if p:\n",
"            set_grid(g0, r, c, d, w)\n",
"    unused_letters = [l for l, u in zip((c for c in cat(cat(l) for l in grid)), (c for c in cat(cat(l) for l in g0)))\n",
"                      if u == '.']\n",
"    unused_letter_count = collections.Counter(unused_letters)\n",
"    makeable_words = []\n",
"    for w in unused_words:\n",
"        unused_word_count = collections.Counter(w)\n",
"        if all(unused_word_count[l] <= unused_letter_count[l] for l in unused_word_count):\n",
"            makeable_words += [w]\n",
"    lwm = sorted(makeable_words, key=len)[-1]\n",
"    lwms = [w for w in makeable_words if len(w) == len(lwm)]\n",
"    if show_anyway or len(set((len(lwp), len(lwa), len(lwm)))) == 3:\n",
"        print('\\n{}'.format(fn))\n",
"        print('{} words present'.format(len(words) - len(unused_words)))\n",
"        print('Longest word present: {}, {} letters ({})'.format(lwp, len(lwp), lwps))\n",
"        print('Longest word absent: {}, {} letters ({})'.format(lwa, len(lwa), lwas))\n",
"        print('{} unused letters'.format(len([c for c in cat(cat(l) for l in g0) if c == '.'])))\n",
"        print('Longest makeable word: {}, {} ({})'.format(lwm, len(lwm), lwms))\n",
"        print('All makeable words: {}'.format(makeable_words))"
]
},
1023 {
1024 "cell_type": "code",
1025 "execution_count": 152,
1026 "metadata": {
1027 "collapsed": true
1028 },
1029 "outputs": [],
1030 "source": [
1031 "def do_wordsearch_tasks(fn, show_anyway=False, show_all_makeable=False):\n",
1032 " width, height, grid, words = read_wordsearch(fn)\n",
1033 " used_words = [p[0] for p in present_many(grid, words)]\n",
1034 " unused_words = [w for w in words if w not in used_words]\n",
1035 " lwp = max(used_words, key=len)\n",
1036 " lwps = [w for w in used_words if len(w) == len(lwp)]\n",
1037 "\n",
1038 " if unused_words:\n",
1039 " lwa = max(unused_words, key=len)\n",
1040 " \n",
1041 " lwas = [w for w in unused_words if len(w) == len(lwa)]\n",
1042 " unused_grid = [[c for c in r] for r in grid]\n",
1043 " for w, r, c, d in present_many(grid, words):\n",
1044 " set_grid(unused_grid, r, c, d, '.' * len(w))\n",
1045 " unused_letters = [c for l in unused_grid for c in l if c != '.']\n",
1046 " unused_letter_count = collections.Counter(unused_letters)\n",
1047 " makeable_words = []\n",
1048 " for w in unused_words:\n",
1049 " unused_word_count = collections.Counter(w)\n",
1050 " if all(unused_word_count[l] <= unused_letter_count[l] for l in unused_word_count):\n",
1051 " makeable_words += [w]\n",
1052 " lwm = sorted(makeable_words, key=len)[-1]\n",
1053 " lwms = [w for w in makeable_words if len(w) == len(lwm)]\n",
1054 " else:\n",
1055 " lwa = ''\n",
1056 " lwas = []\n",
1057 " lwm = ''\n",
1058 " lwms = []\n",
1059 " \n",
1060 " if show_anyway or len(set((len(lwp),len(lwa),len(lwm)))) == 3:\n",
1061 " print('\\n{}'.format(fn))\n",
1062 " print('{} words present'.format(len(words) - len(unused_words)))\n",
1063 " print('Longest word present: {}, {} letters ({})'.format(lwp, len(lwp), lwps))\n",
1064 " print('Longest word absent: {}, {} letters ({})'.format(lwa, len(lwa), lwas))\n",
1065	    "    print('{} unused letters'.format(len(unused_letters)))\n",
1066 " print('Longest makeable word: {}, {} ({})'.format(lwm, len(lwm), lwms))\n",
1067 " if show_all_makeable:\n",
1068 " print('All makeable words: {}'.format(makeable_words))"
1069 ]
1070 },
1071 {
1072 "cell_type": "code",
1073 "execution_count": 136,
1074 "metadata": {},
1075 "outputs": [
1076 {
1077 "name": "stdout",
1078 "output_type": "stream",
1079 "text": [
1080 "\n",
1081 "wordsearch04.txt\n",
1082 "58 words present\n",
1083 "Longest word present: fortification, 13 letters (['fortification'])\n",
1084 "Longest word absent: justification, 13 letters (['justification'])\n",
1085 "57 unused letters\n",
1086 "Longest makeable word: despairing, 10 (['despairing'])\n",
1087 "All makeable words: ['despairing', 'gall', 'impales', 'jerkin']\n"
1088 ]
1089 }
1090 ],
1091 "source": [
1092 "do_wordsearch_tasks('wordsearch04.txt', show_anyway=True)"
1093 ]
1094 },
1095 {
1096 "cell_type": "code",
1097 "execution_count": 25,
1098 "metadata": {},
1099 "outputs": [
1100 {
1101 "name": "stdout",
1102 "output_type": "stream",
1103 "text": [
1104 "\n",
1105 "wordsearch08.txt\n",
1106 "62 words present\n",
1107 "Longest word present: compassionately, 15 letters (['compassionately'])\n",
1108 "Longest word absent: retrospectives, 14 letters (['retrospectives'])\n",
1109 "65 unused letters\n",
1110 "Longest makeable word: vacationing, 11 (['vacationing'])\n"
1111 ]
1112 }
1113 ],
1114 "source": [
1115 "do_wordsearch_tasks('wordsearch08.txt')"
1116 ]
1117 },
1118 {
1119 "cell_type": "code",
1120 "execution_count": 80,
1121 "metadata": {},
1122 "outputs": [
1123 {
1124 "name": "stdout",
1125 "output_type": "stream",
1126 "text": [
1127 "\n",
1128 "04-wordsearch.txt\n",
1129 "62 words present\n",
1130 "Longest word present: compassionately, 15 letters (['compassionately'])\n",
1131 "Longest word absent: retrospectives, 14 letters (['retrospectives'])\n",
1132 "65 unused letters\n",
1133 "Longest makeable word: vacationing, 11 (['vacationing'])\n",
1134 "All makeable words: ['albacore', 'entail', 'excreted', 'factory', 'fornicated', 'frantic', 'grease', 'grocer', 'informal', 'jockeying', 'justified', 'larynxes', 'reviewer', 'sabotage', 'unable', 'vacationing', 'widgeons']\n",
1135 "\n",
1136 "04-wordsearch.txt\n",
1137 "62 words present\n",
1138 "Longest word present: compassionately, 15 letters (['compassionately'])\n",
1139 "Longest word absent: retrospectives, 14 letters (['retrospectives'])\n",
1140 "65 unused letters\n",
1141 "Longest makeable word: vacationing, 11 (['vacationing'])\n",
1142 "All makeable words: ['albacore', 'entail', 'excreted', 'factory', 'fornicated', 'frantic', 'grease', 'grocer', 'informal', 'jockeying', 'justified', 'larynxes', 'reviewer', 'sabotage', 'unable', 'vacationing', 'widgeons']\n",
1143 "\n",
1144 "04-wordsearch.txt\n",
1145 "62 words present\n",
1146 "Longest word present: compassionately, 15 letters (['compassionately'])\n",
1147 "Longest word absent: retrospectives, 14 letters (['retrospectives'])\n",
1148 "65 unused letters\n",
1149 "Longest makeable word: vacationing, 11 (['vacationing'])\n",
1150 "All makeable words: ['albacore', 'entail', 'excreted', 'factory', 'fornicated', 'frantic', 'grease', 'grocer', 'informal', 'jockeying', 'justified', 'larynxes', 'reviewer', 'sabotage', 'unable', 'vacationing', 'widgeons']\n",
1151 "\n",
1152 "04-wordsearch.txt\n",
1153 "62 words present\n",
1154 "Longest word present: compassionately, 15 letters (['compassionately'])\n",
1155 "Longest word absent: retrospectives, 14 letters (['retrospectives'])\n",
1156 "65 unused letters\n",
1157 "Longest makeable word: vacationing, 11 (['vacationing'])\n",
1158 "All makeable words: ['albacore', 'entail', 'excreted', 'factory', 'fornicated', 'frantic', 'grease', 'grocer', 'informal', 'jockeying', 'justified', 'larynxes', 'reviewer', 'sabotage', 'unable', 'vacationing', 'widgeons']\n",
1159 "1 loop, best of 3: 4.78 s per loop\n"
1160 ]
1161 }
1162 ],
1163 "source": [
1164 "%%timeit\n",
1165 "do_wordsearch_tasks_old('04-wordsearch.txt')"
1166 ]
1167 },
1168 {
1169 "cell_type": "code",
1170 "execution_count": 142,
1171 "metadata": {},
1172 "outputs": [
1173 {
1174 "name": "stdout",
1175 "output_type": "stream",
1176 "text": [
1177 "1 loop, best of 3: 463 ms per loop\n"
1178 ]
1179 }
1180 ],
1181 "source": [
1182 "%%timeit\n",
1183 "do_wordsearch_tasks('wordsearch04.txt')"
1184 ]
1185 },
1186 {
1187 "cell_type": "code",
1188 "execution_count": 143,
1189 "metadata": {},
1190 "outputs": [
1191 {
1192 "name": "stdout",
1193 "output_type": "stream",
1194 "text": [
1195 "\n",
1196 "example-wordsearch.txt\n",
1197 "14 words present\n",
1198 "Longest word present: cowhides, 8 letters (['cowhides'])\n",
1199 "Longest word absent: adapting, 8 letters (['adapting', 'coccyxes', 'crumbled', 'molested'])\n",
1200 "57 unused letters\n",
1201 "Longest makeable word: shinier, 7 (['shinier'])\n",
1202 "All makeable words: ['shinier']\n"
1203 ]
1204 }
1205 ],
1206 "source": [
1207 "do_wordsearch_tasks('example-wordsearch.txt', show_anyway=True)"
1208 ]
1209 },
1210 {
1211 "cell_type": "code",
1212 "execution_count": 26,
1213 "metadata": {
1214 "collapsed": true,
1215 "scrolled": true
1216 },
1217 "outputs": [],
1218 "source": [
1219 "# for fn in sorted(os.listdir()):\n",
1220 "# if re.match('wordsearch\\d\\d\\.txt', fn):\n",
1221 "# do_wordsearch_tasks(fn)"
1222 ]
1223 },
1224 {
1225 "cell_type": "code",
1226 "execution_count": 27,
1227 "metadata": {
1228 "scrolled": true
1229 },
1230 "outputs": [
1231 {
1232 "name": "stdout",
1233 "output_type": "stream",
1234 "text": [
1235 "absolved (True, 16, 2, <Direction.right: 2>)\n",
1236 "adorable (True, 11, 4, <Direction.right: 2>)\n",
1237 "aeon (True, 11, 4, <Direction.down: 4>)\n",
1238 "alias (True, 15, 15, <Direction.left: 1>)\n",
1239 "ancestor (False, 0, 0, <Direction.left: 1>)\n",
1240 "baritone (False, 0, 0, <Direction.left: 1>)\n",
1241 "bemusing (False, 0, 0, <Direction.left: 1>)\n",
1242 "blonds (False, 0, 0, <Direction.left: 1>)\n",
1243 "bran (True, 1, 9, <Direction.left: 1>)\n",
1244 "calcite (True, 19, 9, <Direction.upright: 6>)\n",
1245 "candor (True, 17, 12, <Direction.right: 2>)\n",
1246 "conciseness (False, 0, 0, <Direction.left: 1>)\n",
1247 "consequent (False, 0, 0, <Direction.left: 1>)\n",
1248 "cuddle (False, 0, 0, <Direction.left: 1>)\n",
1249 "damming (True, 17, 2, <Direction.right: 2>)\n",
1250 "dashboards (False, 0, 0, <Direction.left: 1>)\n",
1251 "despairing (False, 0, 0, <Direction.left: 1>)\n",
1252 "dint (False, 0, 0, <Direction.left: 1>)\n",
1253 "dullard (True, 8, 2, <Direction.down: 4>)\n",
1254 "dynasty (True, 3, 4, <Direction.downright: 8>)\n",
1255 "employer (False, 0, 0, <Direction.left: 1>)\n",
1256 "exhorts (True, 0, 8, <Direction.left: 1>)\n",
1257 "feted (True, 5, 10, <Direction.right: 2>)\n",
1258 "fill (True, 9, 14, <Direction.upleft: 5>)\n",
1259 "flattens (True, 10, 10, <Direction.upleft: 5>)\n",
1260 "foghorn (True, 10, 11, <Direction.downright: 8>)\n",
1261 "fortification (True, 19, 16, <Direction.left: 1>)\n",
1262 "freakish (False, 0, 0, <Direction.left: 1>)\n",
1263 "frolics (True, 16, 16, <Direction.up: 3>)\n",
1264 "gall (False, 0, 0, <Direction.left: 1>)\n",
1265 "gees (True, 17, 0, <Direction.upright: 6>)\n",
1266 "genies (True, 5, 7, <Direction.upleft: 5>)\n",
1267 "gets (True, 6, 4, <Direction.upleft: 5>)\n",
1268 "hastening (True, 14, 13, <Direction.left: 1>)\n",
1269 "hits (True, 2, 0, <Direction.down: 4>)\n",
1270 "hopelessness (False, 0, 0, <Direction.left: 1>)\n",
1271 "hurlers (True, 18, 0, <Direction.right: 2>)\n",
1272 "impales (False, 0, 0, <Direction.left: 1>)\n",
1273 "infix (False, 0, 0, <Direction.left: 1>)\n",
1274 "inflow (False, 0, 0, <Direction.left: 1>)\n",
1275 "innumerable (False, 0, 0, <Direction.left: 1>)\n",
1276 "intentional (False, 0, 0, <Direction.left: 1>)\n",
1277 "jerkin (False, 0, 0, <Direction.left: 1>)\n",
1278 "justification (False, 0, 0, <Direction.left: 1>)\n",
1279 "kitty (True, 8, 15, <Direction.upleft: 5>)\n",
1280 "knuckles (True, 17, 19, <Direction.up: 3>)\n",
1281 "leaving (False, 0, 0, <Direction.left: 1>)\n",
1282 "like (True, 3, 11, <Direction.left: 1>)\n",
1283 "limitation (True, 8, 3, <Direction.right: 2>)\n",
1284 "locoweeds (False, 0, 0, <Direction.left: 1>)\n",
1285 "loot (True, 3, 19, <Direction.up: 3>)\n",
1286 "lucking (True, 7, 10, <Direction.upleft: 5>)\n",
1287 "lumps (True, 0, 17, <Direction.down: 4>)\n",
1288 "mercerising (True, 15, 17, <Direction.up: 3>)\n",
1289 "monickers (False, 0, 0, <Direction.left: 1>)\n",
1290 "motionless (True, 13, 1, <Direction.up: 3>)\n",
1291 "naturally (True, 9, 16, <Direction.up: 3>)\n",
1292 "nighest (True, 15, 4, <Direction.right: 2>)\n",
1293 "notion (True, 17, 18, <Direction.up: 3>)\n",
1294 "ogled (True, 1, 18, <Direction.down: 4>)\n",
1295 "originality (False, 0, 0, <Direction.left: 1>)\n",
1296 "outings (False, 0, 0, <Direction.left: 1>)\n",
1297 "pendulous (False, 0, 0, <Direction.left: 1>)\n",
1298 "piled (True, 1, 10, <Direction.right: 2>)\n",
1299 "pins (True, 7, 4, <Direction.upleft: 5>)\n",
1300 "pithier (False, 0, 0, <Direction.left: 1>)\n",
1301 "prep (True, 10, 4, <Direction.right: 2>)\n",
1302 "randomness (False, 0, 0, <Direction.left: 1>)\n",
1303 "rectors (False, 0, 0, <Direction.left: 1>)\n",
1304 "redrew (False, 0, 0, <Direction.left: 1>)\n",
1305 "reformulated (False, 0, 0, <Direction.left: 1>)\n",
1306 "remoteness (False, 0, 0, <Direction.left: 1>)\n",
1307 "retaking (True, 6, 0, <Direction.down: 4>)\n",
1308 "rethink (False, 0, 0, <Direction.left: 1>)\n",
1309 "rope (True, 9, 4, <Direction.right: 2>)\n",
1310 "rubier (True, 0, 4, <Direction.downright: 8>)\n",
1311 "sailors (True, 7, 15, <Direction.up: 3>)\n",
1312 "scowls (False, 0, 0, <Direction.left: 1>)\n",
1313 "scum (True, 16, 11, <Direction.right: 2>)\n",
1314 "sepals (True, 6, 10, <Direction.upright: 6>)\n",
1315 "sequencers (False, 0, 0, <Direction.left: 1>)\n",
1316 "serf (False, 0, 0, <Direction.left: 1>)\n",
1317 "shoaled (True, 11, 18, <Direction.up: 3>)\n",
1318 "shook (False, 0, 0, <Direction.left: 1>)\n",
1319 "sonic (True, 18, 18, <Direction.left: 1>)\n",
1320 "spottiest (False, 0, 0, <Direction.left: 1>)\n",
1321 "stag (True, 7, 8, <Direction.left: 1>)\n",
1322 "stood (False, 0, 0, <Direction.left: 1>)\n",
1323 "stratum (True, 2, 13, <Direction.left: 1>)\n",
1324 "strong (True, 4, 19, <Direction.down: 4>)\n",
1325 "studying (True, 0, 16, <Direction.left: 1>)\n",
1326 "surtaxing (False, 0, 0, <Direction.left: 1>)\n",
1327 "tailing (True, 13, 6, <Direction.right: 2>)\n",
1328 "tears (True, 13, 3, <Direction.up: 3>)\n",
1329 "teazles (True, 4, 10, <Direction.downright: 8>)\n",
1330 "vans (True, 18, 8, <Direction.upright: 6>)\n",
1331 "wardrobes (False, 0, 0, <Direction.left: 1>)\n",
1332 "wooded (True, 12, 5, <Direction.right: 2>)\n",
1333 "worsts (True, 1, 0, <Direction.downright: 8>)\n",
1334 "zings (True, 10, 14, <Direction.upleft: 5>)\n"
1335 ]
1336 }
1337 ],
1338 "source": [
1339 "width, height, grid, words = read_wordsearch('wordsearch04.txt')\n",
1340 "for w in words:\n",
1341 " print(w, present(grid, w))"
1342 ]
1343 },
1344 {
1345 "cell_type": "markdown",
1346 "metadata": {},
1347 "source": [
1348 "## Example for question text"
1349 ]
1350 },
1351 {
1352 "cell_type": "code",
1353 "execution_count": 28,
1354 "metadata": {
1355 "collapsed": true
1356 },
1357 "outputs": [],
1358 "source": [
1359 "import copy\n",
1360 "grid = [['.', '.', '.', 'p', '.', 'm', 'o', 'w', 'n', '.'], ['.', 's', 'd', 's', 'e', '.', '.', 'e', 'e', '.'], ['.', 'e', '.', 'e', 'l', 'a', 'd', '.', 'c', 'r'], ['p', 'i', '.', 'd', 't', 'i', 'r', '.', 'a', 'h'], ['r', 'z', 's', 'i', 'w', 'o', 'v', 's', 'p', 'u'], ['o', 'a', 'w', 'h', '.', 'k', 'i', 'e', 'a', 'b'], ['b', 'r', 'o', 'w', '.', 'c', '.', 'r', 'd', 'a'], ['e', 'c', 'n', 'o', 't', 'o', 'p', 's', '.', 'r'], ['d', '.', 'k', 'c', '.', 'd', '.', '.', '.', 'b'], ['.', 's', 't', 'a', 'p', 'l', 'e', '.', '.', '.']]\n",
1361 "padded_grid = [['f', 'h', 'j', 'p', 'a', 'm', 'o', 'w', 'n', 'q'], ['w', 's', 'd', 's', 'e', 'u', 'q', 'e', 'e', 'v'], ['i', 'e', 'a', 'e', 'l', 'a', 'd', 'h', 'c', 'r'], ['p', 'i', 'e', 'd', 't', 'i', 'r', 'i', 'a', 'h'], ['r', 'z', 's', 'i', 'w', 'o', 'v', 's', 'p', 'u'], ['o', 'a', 'w', 'h', 'a', 'k', 'i', 'e', 'a', 'b'], ['b', 'r', 'o', 'w', 'p', 'c', 'f', 'r', 'd', 'a'], ['e', 'c', 'n', 'o', 't', 'o', 'p', 's', 's', 'r'], ['d', 'i', 'k', 'c', 'h', 'd', 'n', 'p', 'n', 'b'], ['b', 's', 't', 'a', 'p', 'l', 'e', 'o', 'k', 'r']]\n",
1362 "present_words = ['probed', 'staple', 'rioted', 'cowhides', 'tops', 'knows', 'lived', 'rhubarb', 'crazies', 'dock', 'apace', 'mown', 'pears', 'wide']\n",
1363 "decoy_words = ['fickler', 'adapting', 'chump', 'foaming', 'molested', 'carnal', 'crumbled', 'guts', 'minuend', 'bombing', 'winced', 'coccyxes', 'solaria', 'shinier', 'cackles']"
1364 ]
1365 },
1366 {
1367 "cell_type": "code",
1368 "execution_count": 29,
1369 "metadata": {},
1370 "outputs": [
1371 {
1372 "data": {
1373 "text/plain": [
1374 "'apace, cowhides, crazies, dock, knows, lived, mown, pears, probed, rhubarb, rioted, staple, tops, wide'"
1375 ]
1376 },
1377 "execution_count": 29,
1378 "metadata": {},
1379 "output_type": "execute_result"
1380 }
1381 ],
1382 "source": [
1383 "', '.join(sorted(present_words))"
1384 ]
1385 },
1386 {
1387 "cell_type": "code",
1388 "execution_count": 92,
1389 "metadata": {},
1390 "outputs": [
1391 {
1392 "data": {
1393 "text/plain": [
1394 "(76, 14)"
1395 ]
1396 },
1397 "execution_count": 92,
1398 "metadata": {},
1399 "output_type": "execute_result"
1400 }
1401 ],
1402 "source": [
1403 "sum(len(w) for w in present_words), len(present_words)"
1404 ]
1405 },
1406 {
1407 "cell_type": "code",
1408 "execution_count": 30,
1409 "metadata": {},
1410 "outputs": [
1411 {
1412 "data": {
1413 "text/plain": [
1414 "'adapting, bombing, cackles, carnal, chump, coccyxes, crumbled, fickler, foaming, guts, minuend, molested, shinier, solaria, winced'"
1415 ]
1416 },
1417 "execution_count": 30,
1418 "metadata": {},
1419 "output_type": "execute_result"
1420 }
1421 ],
1422 "source": [
1423 "', '.join(sorted(decoy_words))"
1424 ]
1425 },
1426 {
1427 "cell_type": "code",
1428 "execution_count": 31,
1429 "metadata": {},
1430 "outputs": [
1431 {
1432 "data": {
1433 "text/plain": [
1434 "'adapting, apace, bombing, cackles, carnal, chump, coccyxes, cowhides, crazies, crumbled, dock, fickler, foaming, guts, knows, lived, minuend, molested, mown, pears, probed, rhubarb, rioted, shinier, solaria, staple, tops, wide, winced'"
1435 ]
1436 },
1437 "execution_count": 31,
1438 "metadata": {},
1439 "output_type": "execute_result"
1440 }
1441 ],
1442 "source": [
1443 "', '.join(sorted(present_words + decoy_words))"
1444 ]
1445 },
1446 {
1447 "cell_type": "code",
1448 "execution_count": 32,
1449 "metadata": {},
1450 "outputs": [
1451 {
1452 "name": "stdout",
1453 "output_type": "stream",
1454 "text": [
1455 "probed 3 0 6 Direction.down [(3, 0), (4, 0), (5, 0), (6, 0), (7, 0), (8, 0)]\n",
1456 "staple 9 1 6 Direction.right [(9, 1), (9, 2), (9, 3), (9, 4), (9, 5), (9, 6)]\n",
1457 "rioted 6 7 6 Direction.upleft [(6, 7), (5, 6), (4, 5), (3, 4), (2, 3), (1, 2)]\n",
1458 "cowhides 8 3 8 Direction.up [(8, 3), (7, 3), (6, 3), (5, 3), (4, 3), (3, 3), (2, 3), (1, 3)]\n",
1459 "tops 7 4 4 Direction.right [(7, 4), (7, 5), (7, 6), (7, 7)]\n",
1460 "knows 8 2 5 Direction.up [(8, 2), (7, 2), (6, 2), (5, 2), (4, 2)]\n",
1461 "lived 2 4 5 Direction.downright [(2, 4), (3, 5), (4, 6), (5, 7), (6, 8)]\n",
1462 "rhubarb 2 9 7 Direction.down [(2, 9), (3, 9), (4, 9), (5, 9), (6, 9), (7, 9), (8, 9)]\n",
1463 "crazies 7 1 7 Direction.up [(7, 1), (6, 1), (5, 1), (4, 1), (3, 1), (2, 1), (1, 1)]\n",
1464 "dock 8 5 4 Direction.up [(8, 5), (7, 5), (6, 5), (5, 5)]\n",
1465 "apace 5 8 5 Direction.up [(5, 8), (4, 8), (3, 8), (2, 8), (1, 8)]\n",
1466 "mown 0 5 4 Direction.right [(0, 5), (0, 6), (0, 7), (0, 8)]\n",
1467 "pears 0 3 5 Direction.downright [(0, 3), (1, 4), (2, 5), (3, 6), (4, 7)]\n",
1468 "wide 4 4 4 Direction.upright [(4, 4), (3, 5), (2, 6), (1, 7)]\n"
1469 ]
1470 },
1471 {
1472 "data": {
1473 "text/plain": [
1474 "[(7, 5), (2, 3), (3, 5)]"
1475 ]
1476 },
1477 "execution_count": 32,
1478 "metadata": {},
1479 "output_type": "execute_result"
1480 }
1481 ],
1482 "source": [
1483 "cts = collections.Counter()\n",
1484 "for w in present_words:\n",
1485 " _, r, c, d = present(grid, w)\n",
1486 " inds = indices(grid, r, c, len(w), d)\n",
1487 " for i in inds:\n",
1488 " cts[i] += 1\n",
1489 " print(w, r, c, len(w), d, inds)\n",
1490 "[i for i in cts if cts[i] > 1]"
1491 ]
1492 },
1493 {
1494 "cell_type": "code",
1495 "execution_count": 33,
1496 "metadata": {},
1497 "outputs": [
1498 {
1499 "name": "stdout",
1500 "output_type": "stream",
1501 "text": [
1502 "\n",
1503 "example-wordsearch.txt\n",
1504 "14 words present\n",
1505 "Longest word present: cowhides, 8 letters (['cowhides'])\n",
1506 "Longest word absent: molested, 8 letters (['adapting', 'coccyxes', 'crumbled', 'molested'])\n",
1507 "27 unused letters\n",
1508 "Longest makeable word: shinier, 7 (['shinier'])\n"
1509 ]
1510 }
1511 ],
1512 "source": [
1513 "do_wordsearch_tasks('example-wordsearch.txt', show_anyway=True)"
1514 ]
1515 },
1516 {
1517 "cell_type": "code",
1518 "execution_count": 34,
1519 "metadata": {},
1520 "outputs": [
1521 {
1522 "name": "stdout",
1523 "output_type": "stream",
1524 "text": [
1525 "..........\n",
1526 "..........\n",
1527 "....l.....\n",
1528 ".....i....\n",
1529 "......v...\n",
1530 ".......e..\n",
1531 "........d.\n",
1532 "..........\n",
1533 "..........\n",
1534 "..........\n"
1535 ]
1536 }
1537 ],
1538 "source": [
1539 "g = empty_grid(10, 10)\n",
1540 "set_grid(g, 2, 4, Direction.downright, 'lived')\n",
1541 "print(show_grid(g))"
1542 ]
1543 },
1544 {
1545 "cell_type": "code",
1546 "execution_count": 35,
1547 "metadata": {},
1548 "outputs": [
1549 {
1550 "name": "stdout",
1551 "output_type": "stream",
1552 "text": [
1553 "..........\n",
1554 ".......e..\n",
1555 "....l.d...\n",
1556 ".....i....\n",
1557 "....w.v...\n",
1558 ".......e..\n",
1559 "........d.\n",
1560 "..........\n",
1561 "..........\n",
1562 ".staple...\n"
1563 ]
1564 }
1565 ],
1566 "source": [
1567 "g = empty_grid(10, 10)\n",
1568 "set_grid(g, 2, 4, Direction.downright, 'lived')\n",
1569 "set_grid(g, 4, 4, Direction.upright, 'wide')\n",
1570 "set_grid(g, 9, 1, Direction.right, 'staple')\n",
1571 "print(show_grid(g))"
1572 ]
1573 },
1574 {
1575 "cell_type": "code",
1576 "execution_count": 36,
1577 "metadata": {},
1578 "outputs": [
1579 {
1580 "name": "stdout",
1581 "output_type": "stream",
1582 "text": [
1583 "..........\n",
1584 "...s......\n",
1585 "...e......\n",
1586 "...d......\n",
1587 "...i......\n",
1588 "...h......\n",
1589 "...w......\n",
1590 "...o......\n",
1591 "...c......\n",
1592 "..........\n"
1593 ]
1594 }
1595 ],
1596 "source": [
1597 "g = empty_grid(10, 10)\n",
1598 "set_grid(g, 8, 3, Direction.up, 'cowhides')\n",
1599 "print(show_grid(g))"
1600 ]
1601 },
1602 {
1603 "cell_type": "code",
1604 "execution_count": 39,
1605 "metadata": {},
1606 "outputs": [
1607 {
1608 "name": "stdout",
1609 "output_type": "stream",
1610 "text": [
1611 "..........\n",
1612 "...s...e..\n",
1613 "...el.d...\n",
1614 "...d.i....\n",
1615 "...iw.v...\n",
1616 "...h...e..\n",
1617 "...w....d.\n",
1618 "...o......\n",
1619 "...c......\n",
1620 ".staple...\n"
1621 ]
1622 }
1623 ],
1624 "source": [
1625 "g = empty_grid(10, 10)\n",
1626 "set_grid(g, 2, 4, Direction.downright, 'lived')\n",
1627 "set_grid(g, 4, 4, Direction.upright, 'wide')\n",
1628 "set_grid(g, 9, 1, Direction.right, 'staple')\n",
1629 "set_grid(g, 8, 3, Direction.up, 'cowhides')\n",
1630 "print(show_grid(g))"
1631 ]
1632 },
1633 {
1634 "cell_type": "code",
1635 "execution_count": 37,
1636 "metadata": {},
1637 "outputs": [
1638 {
1639 "name": "stdout",
1640 "output_type": "stream",
1641 "text": [
1642 "..........\n",
1643 "..........\n",
1644 "..........\n",
1645 "..........\n",
1646 "..........\n",
1647 "..........\n",
1648 "brow......\n",
1649 "..........\n",
1650 "..........\n",
1651 "..........\n"
1652 ]
1653 }
1654 ],
1655 "source": [
1656 "# Example of word in grid that is English but isn't in the words listed in the puzzle.\n",
1657 "g = empty_grid(10, 10)\n",
1658 "set_grid(g, 6, 0, Direction.right, 'brow')\n",
1659 "print(show_grid(g))"
1660 ]
1661 },
1662 {
1663 "cell_type": "code",
1664 "execution_count": 38,
1665 "metadata": {},
1666 "outputs": [
1667 {
1668 "name": "stdout",
1669 "output_type": "stream",
1670 "text": [
1671 "fhj.a....q\n",
1672 "w....uq..v\n",
1673 "i.a....h..\n",
1674 "..e....i..\n",
1675 "..........\n",
1676 "....a.....\n",
1677 "....p.f...\n",
1678 "........s.\n",
1679 ".i..h.npn.\n",
1680 "b......okr\n"
1681 ]
1682 },
1683 {
1684 "data": {
1685 "text/plain": [
1686 "'aaabeffhhhiiijknnoppqqrsuvw'"
1687 ]
1688 },
1689 "execution_count": 38,
1690 "metadata": {},
1691 "output_type": "execute_result"
1692 }
1693 ],
1694 "source": [
1695 "unused_grid = copy.deepcopy(padded_grid)\n",
1696 "for w in present_words:\n",
1697 " _, r, c, d = present(grid, w)\n",
1698 " set_grid(unused_grid, r, c, d, '.' * len(w))\n",
1699 "print(show_grid(unused_grid))\n",
1700 "cat(sorted(c for l in unused_grid for c in l if c in string.ascii_letters))"
1701 ]
1702 },
1703 {
1704 "cell_type": "code",
1705 "execution_count": 201,
1706 "metadata": {},
1707 "outputs": [
1708 {
1709 "data": {
1710 "text/plain": [
1711 "9"
1712 ]
1713 },
1714 "execution_count": 201,
1715 "metadata": {},
1716 "output_type": "execute_result"
1717 }
1718 ],
1719 "source": [
1720	    "sum(1 for l in unused_grid for c in l if c in 'aeiou')"
1721 ]
1722 },
1723 {
1724 "cell_type": "markdown",
1725 "metadata": {},
1726 "source": [
1727 "# All wordsearch puzzles"
1728 ]
1729 },
1730 {
1731 "cell_type": "code",
1732 "execution_count": 108,
1733 "metadata": {
1734 "collapsed": true
1735 },
1736 "outputs": [],
1737 "source": [
1738 "def read_all_wordsearch(fn):\n",
1739 " with open(fn) as f:\n",
1740 " text = f.read().strip()\n",
1741 " puzzles_text = text.split('\\n\\n')\n",
1742 " puzzles = []\n",
1743 " for p in puzzles_text:\n",
1744 " lines = p.splitlines()\n",
1745 " w, h = [int(s) for s in lines[0].split('x')]\n",
1746 " grid = lines[1:h+1]\n",
1747 " words = lines[h+1:]\n",
1748 " puzzles += [(w, h, grid, words)]\n",
1749 " return puzzles"
1750 ]
1751 },
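   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
     "collapsed": true
    },
    "outputs": [],
    "source": [
     "# Sanity check (a sketch added for illustration, not part of the original solution):\n",
     "# each parsed puzzle should match its declared dimensions, assuming\n",
     "# 'all-wordsearches.txt' follows the wxh-header format described at the top.\n",
     "for w, h, g, ws in read_all_wordsearch('all-wordsearches.txt'):\n",
     "    assert len(g) == h and all(len(row) == w for row in g)"
    ]
   },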
1752 {
1753 "cell_type": "code",
1754 "execution_count": 109,
1755 "metadata": {},
1756 "outputs": [
1757 {
1758 "data": {
1759 "text/plain": [
1760 "(20,\n",
1761 " 20,\n",
1762 " ['esaetarotcetorpwvnma',\n",
1763 " 'dhuejswellraqzassuzn',\n",
1764 " 'riopstewsidftvenlnlo',\n",
1765 " 'dltnpupodiafilgeenap',\n",
1766 " 'yeooooosvconsgatvenm',\n",
1767 " 'tgtonrsdtpgsungsireo',\n",
1768 " 'csmtnlaistdklisaeyrp',\n",
1769 " 'fguuusortrewfkrfdluo',\n",
1770 " 'dotcnpvoyiraicsrieht',\n",
1771 " 'drtcoataksogdaekcoyy',\n",
1772 " 'igoialcuneoneuasirvy',\n",
1773 " 'ajnzdpacoqrbsoshmgnt',\n",
1774 " 'mmsxeetcewussviipeei',\n",
1775 " 'esbrevrioprivilramsr',\n",
1776 " 'tsgerdvcvutesbtrrska',\n",
1777 " 'eyselimisapenheettcl',\n",
1778 " 'ryponacqcetsdsddiouu',\n",
1779 " 'streitsaotsedalaanlg',\n",
1780 " 'foretselppusdfrsleae',\n",
1781 " 'utsolacebeardingpher'],\n",
1782 " ['aboveboard',\n",
1783 " 'accents',\n",
1784 " 'applicants',\n",
1785 " 'arbitrarily',\n",
1786 " 'atlas',\n",
1787 " 'bazillions',\n",
1788 " 'bearding',\n",
1789 " 'biathlon',\n",
1790 " 'bivouacking',\n",
1791 " 'canopy',\n",
1792 " 'captivated',\n",
1793 " 'chicory',\n",
1794 " 'cockroach',\n",
1795 " 'construct',\n",
1796 " 'coups',\n",
1797 " 'credit',\n",
1798 " 'depreciates',\n",
1799 " 'diameters',\n",
1800 " 'diffuses',\n",
1801 " 'douse',\n",
1802 " 'downbeats',\n",
1803 " 'dregs',\n",
1804 " 'envy',\n",
1805 " 'expects',\n",
1806 " 'expurgations',\n",
1807 " 'fact',\n",
1808 " 'fastens',\n",
1809 " 'festively',\n",
1810 " 'flubbing',\n",
1811 " 'fops',\n",
1812 " 'fore',\n",
1813 " 'gage',\n",
1814 " 'gapes',\n",
1815 " 'gawks',\n",
1816 " 'gemstone',\n",
1817 " 'grog',\n",
1818 " 'grossly',\n",
1819 " 'handlebar',\n",
1820 " 'haranguing',\n",
1821 " 'honorary',\n",
1822 " 'hulls',\n",
1823 " 'impartial',\n",
1824 " 'insists',\n",
1825 " 'lades',\n",
1826 " 'lane',\n",
1827 " 'levied',\n",
1828 " 'loaned',\n",
1829 " 'locust',\n",
1830 " 'loons',\n",
1831 " 'lucks',\n",
1832 " 'lying',\n",
1833 " 'memoir',\n",
1834 " 'methods',\n",
1835 " 'mutton',\n",
1836 " 'nodular',\n",
1837 " 'nunnery',\n",
1838 " 'onlookers',\n",
1839 " 'outputted',\n",
1840 " 'overhearing',\n",
1841 " 'panicky',\n",
1842 " 'particularly',\n",
1843 " 'peeving',\n",
1844 " 'podia',\n",
1845 " 'pompon',\n",
1846 " 'presenting',\n",
1847 " 'protectorate',\n",
1848 " 'pummels',\n",
1849 " 'ransoms',\n",
1850 " 'regularity',\n",
1851 " 'roof',\n",
1852 " 'salvaged',\n",
1853 " 'scanting',\n",
1854 " 'scions',\n",
1855 " 'shipping',\n",
1856 " 'shirred',\n",
1857 " 'silted',\n",
1858 " 'similes',\n",
1859 " 'smartened',\n",
1860 " 'snicker',\n",
1861 " 'snowdrops',\n",
1862 " 'sobs',\n",
1863 " 'solace',\n",
1864 " 'stews',\n",
1865 " 'stitches',\n",
1866 " 'sulfides',\n",
1867 " 'supplest',\n",
1868 " 'suppositions',\n",
1869 " 'swell',\n",
1870 " 'theirs',\n",
1871 " 'toastier',\n",
1872 " 'tutorial',\n",
1873 " 'unaccepted',\n",
1874 " 'unionising',\n",
1875 " 'vanquish',\n",
1876 " 'venous',\n",
1877 " 'verbs',\n",
1878 " 'vitiation',\n",
1879 " 'waving',\n",
1880 " 'wrens',\n",
1881 " 'yock'])"
1882 ]
1883 },
1884 "execution_count": 109,
1885 "metadata": {},
1886 "output_type": "execute_result"
1887 }
1888 ],
1889 "source": [
1890 "puzzles = read_all_wordsearch('all-wordsearches.txt')\n",
1891 "puzzles[3]"
1892 ]
1893 },
1894 {
1895 "cell_type": "code",
1896 "execution_count": 110,
1897 "metadata": {
1898 "collapsed": true
1899 },
1900 "outputs": [],
1901 "source": [
1902 "def found_words_length(puzzle):\n",
1903 " width, height, grid, words = puzzle\n",
1904 " return sum(len(p[0]) for p in present_many(grid, words))"
1905 ]
1906 },
1907 {
1908 "cell_type": "code",
1909 "execution_count": 113,
1910 "metadata": {
1911 "scrolled": true
1912 },
1913 "outputs": [
1914 {
1915 "data": {
1916 "text/plain": [
1917 "[309,\n",
1918 " 335,\n",
1919 " 338,\n",
1920 " 346,\n",
1921 " 364,\n",
1922 " 372,\n",
1923 " 337,\n",
1924 " 340,\n",
1925 " 353,\n",
1926 " 371,\n",
1927 " 343,\n",
1928 " 348,\n",
1929 " 375,\n",
1930 " 343,\n",
1931 " 363,\n",
1932 " 376,\n",
1933 " 347,\n",
1934 " 363,\n",
1935 " 342,\n",
1936 " 348,\n",
1937 " 377,\n",
1938 " 342,\n",
1939 " 355,\n",
1940 " 351,\n",
1941 " 342,\n",
1942 " 331,\n",
1943 " 362,\n",
1944 " 354,\n",
1945 " 323,\n",
1946 " 353,\n",
1947 " 337,\n",
1948 " 340,\n",
1949 " 349,\n",
1950 " 362,\n",
1951 " 361,\n",
1952 " 350,\n",
1953 " 348,\n",
1954 " 327,\n",
1955 " 370,\n",
1956 " 362,\n",
1957 " 334,\n",
1958 " 333,\n",
1959 " 341,\n",
1960 " 354,\n",
1961 " 355,\n",
1962 " 358,\n",
1963 " 355,\n",
1964 " 358,\n",
1965 " 357,\n",
1966 " 351,\n",
1967 " 351,\n",
1968 " 346,\n",
1969 " 326,\n",
1970 " 332,\n",
1971 " 352,\n",
1972 " 347,\n",
1973 " 346,\n",
1974 " 369,\n",
1975 " 363,\n",
1976 " 361,\n",
1977 " 365,\n",
1978 " 364,\n",
1979 " 359,\n",
1980 " 352,\n",
1981 " 344,\n",
1982 " 352,\n",
1983 " 348,\n",
1984 " 360,\n",
1985 " 333,\n",
1986 " 352,\n",
1987 " 374,\n",
1988 " 360,\n",
1989 " 349,\n",
1990 " 343,\n",
1991 " 360,\n",
1992 " 345,\n",
1993 " 364,\n",
1994 " 355,\n",
1995 " 349,\n",
1996 " 349,\n",
1997 " 355,\n",
1998 " 328,\n",
1999 " 358,\n",
2000 " 344,\n",
2001 " 335,\n",
2002 " 339,\n",
2003 " 365,\n",
2004 " 328,\n",
2005 " 343,\n",
2006 " 350,\n",
2007 " 340,\n",
2008 " 342,\n",
2009 " 342,\n",
2010 " 357,\n",
2011 " 362,\n",
2012 " 333,\n",
2013 " 357,\n",
2014 " 346,\n",
2015 " 341,\n",
2016 " 348]"
2017 ]
2018 },
2019 "execution_count": 113,
2020 "metadata": {},
2021 "output_type": "execute_result"
2022 }
2023 ],
2024 "source": [
2025 "[found_words_length(p) for p in puzzles]"
2026 ]
2027 },
2028 {
2029 "cell_type": "code",
2030 "execution_count": 114,
2031 "metadata": {},
2032 "outputs": [
2033 {
2034 "data": {
2035 "text/plain": [
2036 "34988"
2037 ]
2038 },
2039 "execution_count": 114,
2040 "metadata": {},
2041 "output_type": "execute_result"
2042 }
2043 ],
2044 "source": [
2045 "sum(found_words_length(p) for p in puzzles)"
2046 ]
2047 },
2048 {
2049 "cell_type": "code",
2050 "execution_count": 122,
2051 "metadata": {
2052 "collapsed": true
2053 },
2054 "outputs": [],
2055 "source": [
2056 "def max_unfound_word_length(puzzle):\n",
2057 " width, height, grid, words = puzzle\n",
2058 " presences = present_many(grid, words)\n",
2059 " used_words = [p[0] for p in presences]\n",
2060 " unused_words = [w for w in words if w not in used_words]\n",
2061 " \n",
2062 " unused_grid = [[c for c in r] for r in grid]\n",
2063 " for w, r, c, d in presences:\n",
2064 " set_grid(unused_grid, r, c, d, '.' * len(w))\n",
2065 " unused_letters = [c for l in unused_grid for c in l if c != '.']\n",
2066 " unused_letter_count = collections.Counter(unused_letters)\n",
2067 " \n",
2068 " makeable_words = []\n",
2069 " for w in unused_words:\n",
2070 " unused_word_count = collections.Counter(w)\n",
2071 " if all(unused_word_count[l] <= unused_letter_count[l] for l in unused_word_count):\n",
2072 " makeable_words += [w]\n",
2073	    "    lwm = max((len(w) for w in makeable_words), default=0)\n",
2074 " return lwm"
2075 ]
2076 },
2077 {
2078 "cell_type": "code",
2079 "execution_count": 123,
2080 "metadata": {
2081 "scrolled": true
2082 },
2083 "outputs": [
2084 {
2085 "data": {
2086 "text/plain": [
2087 "[11,\n",
2088 " 13,\n",
2089 " 11,\n",
2090 " 11,\n",
2091 " 10,\n",
2092 " 9,\n",
2093 " 12,\n",
2094 " 12,\n",
2095 " 11,\n",
2096 " 10,\n",
2097 " 15,\n",
2098 " 11,\n",
2099 " 10,\n",
2100 " 11,\n",
2101 " 10,\n",
2102 " 12,\n",
2103 " 11,\n",
2104 " 9,\n",
       " 11,\n",
       " 11,\n",
       " 10,\n",
       " 12,\n",
       " 11,\n",
       " 12,\n",
       " 13,\n",
       " 12,\n",
       " 10,\n",
       " 10,\n",
       " 11,\n",
       " 9,\n",
       " 11,\n",
       " 11,\n",
       " 8,\n",
       " 12,\n",
       " 13,\n",
       " 10,\n",
       " 11,\n",
       " 11,\n",
       " 9,\n",
       " 8,\n",
       " 12,\n",
       " 13,\n",
       " 11,\n",
       " 9,\n",
       " 13,\n",
       " 11,\n",
       " 11,\n",
       " 11,\n",
       " 13,\n",
       " 12,\n",
       " 10,\n",
       " 11,\n",
       " 11,\n",
       " 12,\n",
       " 10,\n",
       " 8,\n",
       " 12,\n",
       " 11,\n",
       " 10,\n",
       " 11,\n",
       " 8,\n",
       " 12,\n",
       " 10,\n",
       " 11,\n",
       " 12,\n",
       " 12,\n",
       " 12,\n",
       " 12,\n",
       " 11,\n",
       " 11,\n",
       " 12,\n",
       " 12,\n",
       " 13,\n",
       " 10,\n",
       " 10,\n",
       " 10,\n",
       " 10,\n",
       " 12,\n",
       " 11,\n",
       " 11,\n",
       " 12,\n",
       " 11,\n",
       " 9,\n",
       " 14,\n",
       " 11,\n",
       " 13,\n",
       " 12,\n",
       " 10,\n",
       " 10,\n",
       " 12,\n",
       " 13,\n",
       " 10,\n",
       " 10,\n",
       " 12,\n",
       " 11,\n",
       " 12,\n",
       " 11,\n",
       " 11,\n",
       " 12,\n",
       " 11]"
      ]
     },
     "execution_count": 123,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "[max_unfound_word_length(p) for p in puzzles]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 121,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "1106"
      ]
     },
     "execution_count": 121,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "sum(max_unfound_word_length(p) for p in puzzles)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 124,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1 loop, best of 3: 18.8 s per loop\n"
     ]
    }
   ],
   "source": [
    "%%timeit\n",
    "sum(found_words_length(p) for p in puzzles)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 125,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1 loop, best of 3: 18.7 s per loop\n"
     ]
    }
   ],
   "source": [
    "%%timeit\n",
    "sum(max_unfound_word_length(p) for p in puzzles)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 190,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "huge-wordsearch.txt\n",
      "1149 words present\n",
      "Longest word present: hypersensitivities, 18 letters (['hypersensitivities'])\n",
      "Longest word absent: rambunctiousness, 16 letters (['rambunctiousness'])\n",
      "57 unused letters\n",
      "Longest makeable word: rambunctiousness, 16 (['rambunctiousness'])\n"
     ]
    }
   ],
   "source": [
    "do_wordsearch_tasks('huge-wordsearch.txt', show_anyway=True)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 191,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1 loop, best of 3: 1min 4s per loop\n"
     ]
    }
   ],
   "source": [
    "%%timeit\n",
    "do_wordsearch_tasks('huge-wordsearch.txt')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 192,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "width, height, grid, words = read_wordsearch('huge-wordsearch.txt')\n",
    "pm = present_many(grid, words)\n",
    "pold = [w for w in words if present(grid, w)[0]]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 193,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "1149"
      ]
     },
     "execution_count": 193,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "len(pm)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 194,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "1149"
      ]
     },
     "execution_count": 194,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "len(pold)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 195,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "1149"
      ]
     },
     "execution_count": 195,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "pm_extra = [p for p in pm if p not in pold]\n",
    "len(pm_extra)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 196,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[('poltroons', 62, 65, <Direction.downleft: 7>),\n",
       " ('dogged', 7, 45, <Direction.down: 4>),\n",
       " ('activist', 51, 35, <Direction.downright: 8>)]"
      ]
     },
     "execution_count": 196,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "list(pm)[:3]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 197,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "['abound', 'abstracted', 'accidents']"
      ]
     },
     "execution_count": 197,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "pold[:3]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 198,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[('retreading', 1),\n",
       " ('disavows', 1),\n",
       " ('finals', 1),\n",
       " ('conniver', 1),\n",
       " ('warding', 1),\n",
       " ('melon', 1),\n",
       " ('paging', 1),\n",
       " ('booties', 1),\n",
       " ('civilises', 1),\n",
       " ('hugged', 1)]"
      ]
     },
     "execution_count": 198,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "collections.Counter(p[0] for p in pm).most_common(10)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 199,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[]"
      ]
     },
     "execution_count": 199,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "[p for p in pm if p[0] == 'seen']"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.5.2+"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}