
Commit 239a0ff

file 10
1 parent a1f1e50 commit 239a0ff

File tree

2 files changed: +33 -1 lines changed


10_find_files_recursively.py

Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@
import fnmatch
import os

# constants
PATH = '/../../../..'
PATTERN = '*.py'


def get_file_names(filepath, pattern):
    matches = []
    if os.path.exists(filepath):
        for root, dirnames, filenames in os.walk(filepath):
            for filename in fnmatch.filter(filenames, pattern):
                # matches.append(os.path.join(root, filename))  # full path
                matches.append(os.path.join(filename))  # just file name
        if matches:
            print "Found {} files:".format(len(matches))
            output_files(matches)
        else:
            print "No files found."
    else:
        print "Sorry that path does not exist. Try again."


def output_files(list_of_files):
    for filename in list_of_files:
        print filename


if __name__ == '__main__':
    all_files = get_file_names(PATH, PATTERN)
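
For comparison, here is a minimal Python 3 sketch of the same recursive search built on pathlib.Path.rglob instead of os.walk plus fnmatch. It is not part of the commit, and the START_PATH value is illustrative:

from pathlib import Path

# Illustrative values, not taken from the commit above.
START_PATH = '.'
PATTERN = '*.py'


def get_file_names(filepath, pattern):
    base = Path(filepath)
    if not base.exists():
        print("Sorry, that path does not exist. Try again.")
        return []
    # rglob() walks the directory tree recursively and yields matching paths.
    matches = [p.name for p in base.rglob(pattern)]
    if matches:
        print("Found {} files:".format(len(matches)))
        for name in matches:
            print(name)
    else:
        print("No files found.")
    return matches


if __name__ == '__main__':
    all_files = get_file_names(START_PATH, PATTERN)

Unlike the script above, this sketch also returns the list of matches, so the all_files assignment actually captures the result.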

readme.md

Lines changed: 2 additions & 1 deletion
@@ -8,4 +8,5 @@
 1. **06_execution_time.py**: class used for timing execution of code
 1. **07_benchmark_permissions_loading_django.py**: benchmark loading of permissions in Django
 1. **08_basic_email_web_crawler.py**: web crawler for grabbing emails from a website recursively
-1. **08_basic_link_web_crawler.py**: web crawler for grabbing links from a website recursively
+1. **09_basic_link_web_crawler.py**: web crawler for grabbing links from a website recursively
+1. **10_find_files_recursively.py**: recursively grab files from a directory
