Fundamentals of Deep Learning
SECOND EDITION

Designing Next-Generation Machine Intelligence Algorithms

Nithin Buduma, Nikhil Buduma, and Joe Papa
with contributions by Nicholas Locascio
Fundamentals of Deep Learning
by Nithin Buduma, Nikhil Buduma, and Joe Papa
Copyright © 2022 Nithin Buduma and Mobile Insights Technology
Group, LLC. All rights reserved.
Printed in the United States of America.
Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North,
Sebastopol, CA 95472.
O’Reilly books may be purchased for educational, business, or sales
promotional use. Online editions are also available for most titles
(http://oreilly.com). For more information, contact our
corporate/institutional sales department: 800-998-9938 or
corporate@oreilly.com.

Acquisitions Editor: Rebecca Novack

Development Editor: Melissa Potter

Production Editor: Katherine Tozer

Copyeditor: Sonia Saruba

Proofreader: Stephanie English

Indexer: Judith McConville

Interior Designer: David Futato

Cover Designer: Karen Montgomery

Illustrator: Kate Dullea

June 2017: First Edition


May 2022: Second Edition
Revision History for the Second Edition
2022-05-16: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781492082187 for
release details.
The O’Reilly logo is a registered trademark of O’Reilly Media, Inc.
Fundamentals of Deep Learning, the cover image, and related trade
dress are trademarks of O’Reilly Media, Inc.
While the publisher and the authors have used good faith efforts to
ensure that the information and instructions contained in this work
are accurate, the publisher and the authors disclaim all responsibility
for errors or omissions, including without limitation responsibility for
damages resulting from the use of or reliance on this work. Use of
the information and instructions contained in this work is at your
own risk. If any code samples or other technology this work contains
or describes is subject to open source licenses or the intellectual
property rights of others, it is your responsibility to ensure that your
use thereof complies with such licenses and/or rights.
978-1-492-08218-7
LSI
Preface

With the reinvigoration of neural networks in the 2000s, deep
learning has become an extremely active area of research that is
paving the way for modern machine learning. This book uses
exposition and examples to help you understand major concepts in
this complicated field. Large companies such as Google, Microsoft,
and Facebook have taken notice and are actively growing in-house
deep learning teams. For the rest of us, deep learning is still a pretty
complex and difficult subject to grasp. Research papers are filled to
the brim with jargon, and scattered online tutorials do little to help
build a strong intuition for why and how deep learning practitioners
approach problems. Our goal is to bridge this gap.
In this second edition, we provide more rigorous background
sections in mathematics with the aim of better equipping you for the
material in the rest of the book. In addition, we have updated
chapters in sequence analysis, computer vision, and reinforcement
learning with deep dives into the latest advancements in the fields.
And finally, we have added new chapters in the fields of generative
modeling and interpretability to provide you with a broader view of
the field of deep learning. We hope that these updates inspire you to
practice deep learning on your own and apply what you learn to
solve meaningful problems in the real world.

Prerequisites and Objectives


This book is aimed at an audience with a basic operating
understanding of calculus and Python programming. In this latest
edition, we provide extensive mathematical background chapters,
specifically in linear algebra and probability, to prepare you for the
material that lies ahead.
By the end of the book, we hope you will be left with an intuition for
how to approach problems using deep learning, the historical
context for modern deep learning approaches, and a familiarity with
implementing deep learning algorithms using the PyTorch open
source library.
How Is This Book Organized?
The first chapters of this book are dedicated to developing
mathematical maturity via deep dives into linear algebra and
probability, which are deeply embedded in the field of deep learning.
The next several chapters discuss the structure of feed-forward
neural networks, how to implement them in code, and how to train
and evaluate them on real-world datasets. The rest of the book is
dedicated to specific applications of deep learning and
understanding the intuition behind the specialized learning
techniques and neural network architectures developed for those
applications. Although we cover advanced research in these latter
sections, we hope to provide a breakdown of these techniques that
is derived from first principles and digestible.

Conventions Used in This Book


The following typographical conventions are used in this book:
Italic
Indicates new terms, URLs, email addresses, filenames, and file
extensions.

Constant width
Used for program listings, as well as within paragraphs to refer to
program elements such as variable or function names, databases,
data types, environment variables, statements, and keywords.

NOTE
This element signifies a general note.
WARNING
This element indicates a warning or caution.

Using Code Examples


Supplemental material (code examples, exercises, etc.) is available
for download at https://github.com/darksigma/Fundamentals-of-
Deep-Learning-Book.
If you have a technical question or a problem using the code
examples, please email bookquestions@oreilly.com.
This book is here to help you get your job done. In general, if
example code is offered with this book, you may use it in your
programs and documentation. You do not need to contact us for
permission unless you’re reproducing a significant portion of the
code. For example, writing a program that uses several chunks of
code from this book does not require permission. Selling or
distributing examples from O’Reilly books does require permission.
Answering a question by citing this book and quoting example code
does not require permission. Incorporating a significant amount of
example code from this book into your product’s documentation
does require permission.
We appreciate, but do not require, attribution. An attribution usually
includes the title, author, publisher, and ISBN. For example:
“Fundamentals of Deep Learning by Nithin Buduma, Nikhil Buduma,
and Joe Papa (O’Reilly). Copyright 2022 Nithin Buduma and Mobile
Insights Technology Group, LLC, 978-1-492-08218-7.”
If you feel your use of code examples falls outside fair use or the
permission given above, feel free to contact us at
permissions@oreilly.com.
O’Reilly Online Learning

NOTE
For more than 40 years, O’Reilly Media has provided technology and
business training, knowledge, and insight to help companies succeed.

Our unique network of experts and innovators share their knowledge
and expertise through books, articles, and our online learning
platform. O’Reilly’s online learning platform gives you on-demand
access to live training courses, in-depth learning paths, interactive
coding environments, and a vast collection of text and video from
O’Reilly and 200+ other publishers. For more information, visit
https://oreilly.com.

How to Contact Us
Please address comments and questions concerning this book to the
publisher:

O’Reilly Media, Inc.

1005 Gravenstein Highway North

Sebastopol, CA 95472

800-998-9938 (in the United States or Canada)

707-829-0515 (international or local)

707-829-0104 (fax)

We have a web page for this book, where we list errata, examples,
and any additional information. You can access this page at
https://oreil.ly/fundamentals-of-deep-learning-2e.
Email bookquestions@oreilly.com to comment or ask technical
questions about this book.
For news and information about our books and courses, visit
https://oreilly.com.
Find us on LinkedIn: https://www.linkedin.com/company/oreilly-
media.
Follow us on Twitter: https://twitter.com/oreillymedia.
Watch us on YouTube: https://www.youtube.com/oreillymedia.

Acknowledgements
We’d like to thank several people who have been instrumental in the
completion of this text. We’d like to start by acknowledging Mostafa
Samir and Surya Bhupatiraju, who contributed heavily to the content
of Chapters 7 and 8. We also appreciate the contributions of
Mohamed (Hassan) Kane and Anish Athalye, who worked on early
versions of the code examples in this book’s GitHub repository.

Nithin and Nikhil


This book would not have been possible without the never-ending
support and expertise of our editor, Shannon Cutt. We also
appreciate the commentary provided by our reviewers, Isaac Hodes,
David Andrzejewski, Aaron Schumacher, Vishwesh Ravi Shrimali,
Manjeet Dahiya, Ankur Patel, and Suneeta Mall, who provided
thoughtful, in-depth, and technical commentary on the original
drafts of the text. Finally, we are thankful for all of the insight
provided by our friends and family members, including Jeff Dean,
Venkat Buduma, William, and Jack, as we finalized the manuscript of
the text.
Joe
Updating the code for this book with PyTorch has been an enjoyable
and exciting experience. No endeavor like this can be achieved by
one person alone. First, I would like to thank the PyTorch community
and its 2,100+ contributors for continuing to grow and improve
PyTorch and its deep learning capabilities. It is because of you that
we can demonstrate the concepts described in this book.
I am forever grateful to Rebecca Novack for bringing me into this
project and for her confidence in me as an author. Many thanks to
Melissa Potter and the O'Reilly production staff for making this
updated version come to life.
For his encouragement and support, I’d like to thank Matt Kirk. He’s
been my rock through it all. Thank you for our countless chats full of
ideas and resources.
Special thanks to my kids, Savannah, Caroline, George, and Forrest,
for being patient and understanding when Daddy had to work. And,
most of all, thank you to my wife, Emily, who has always supported
my dreams throughout life. While I diligently wrote code, she cared
for our newborn through sleepless nights while ensuring the “big”
kids had their needs met too. Without her, my contributions to this
project would not be possible.
Chapter 1. Fundamentals of Linear Algebra for Deep Learning

In this chapter, we cover important prerequisite knowledge that will
motivate our discussion of deep learning techniques in the main text
and the optional sidebars at the end of select chapters. Deep
learning has recently experienced a renaissance, both in academic
research and in industry. It has pushed the limits of machine
learning by leaps and bounds, revolutionizing fields such as
computer vision and natural language processing. However, it is
important to remember that deep learning is, at its core, a
culmination of achievements in fields such as calculus, linear
algebra, and probability. Although there are deeper connections to
other fields of mathematics, we focus on the three listed here to
help us broaden our perspective before diving into deep learning.
These fields are key to unlocking both the big picture of deep
learning and the intricate subtleties that make it as exciting as it is.
In this first chapter on background, we cover the fundamentals of
linear algebra.

Data Structures and Operations


The most important data structure in linear algebra (whenever we
reference linear algebra in this text, we refer to its applied variety) is
arguably the matrix, a 2D array of numbers where each entry can be
indexed via its row and column. Think of an Excel spreadsheet,
where you have offers from Company X and Company Y as two
rows, and the columns represent some characteristic of each offer,
such as starting salary, bonus, or position, as shown in Table 1-1.
Table 1-1. Excel spreadsheet

             Salary     Bonus     Position
Company X    $50,000    $5,000    Engineer
Company Y    $40,000    $7,500    Data Scientist
The table format is especially suited to keep track of such data,
where you can index by row and column to find, for example,
Company X’s starting position. Matrices, similarly, are a multipurpose
tool to hold all kinds of data; the data we work with in this book is
numerical. In deep learning, matrices are often used to
represent both datasets and weights in a neural network. A dataset,
for example, has many individual data points with any number of
associated features. A lizard dataset might contain information on
length, weight, speed, age, and other important attributes. We can
represent this intuitively as a matrix or table, where each row
represents an individual lizard, and each column represents a lizard
feature, such as age. However, as opposed to Table 1-1, the matrix
stores only the numbers and assumes that the user has kept track of
which rows correspond to which data points, which columns
correspond to which feature, and what the units are for each
feature, as you can see in Figure 1-1.

Figure 1-1. A comparison of tables and matrices

On the right side, we have a matrix, where it's assumed, for
example, that the age of each lizard is in years, and Komodo Ken
weighs a whopping 50 kilograms! But why even work with matrices
when tables clearly give the user more information? Well, in linear
algebra and even deep learning, operations such as multiplication
and addition are done on the tabular data itself, but such operations
can only be computed efficiently when the data is in a purely
numerical format.
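To make this concrete, here is a minimal sketch of how such a lizard
dataset might be stored as a purely numerical matrix using PyTorch,
the library we use throughout this book. This is an illustrative
example rather than code from the book's repository; the specific
values and the idea of tracking the column names in a separate
Python list are assumptions made for this sketch.

import torch

# We keep track of what each column means ourselves; the matrix
# itself stores only numbers. Column order (assumed for this sketch):
# length (cm), weight (kg), speed (km/h), age (years).
feature_names = ["length_cm", "weight_kg", "speed_kmh", "age_years"]

# Each row is one lizard; each column is one feature.
# The numbers are made up for illustration.
lizards = torch.tensor([
    [ 35.0,  0.4, 20.0,  3.0],   # a small, quick lizard
    [250.0, 50.0, 18.0, 12.0],   # "Komodo Ken", weighing a whopping 50 kg
])

# Index by (row, column): the weight of the second lizard
print(lizards[1, 1])   # tensor(50.)

# A single row is a vector describing one lizard
print(lizards[1])      # tensor([250.,  50.,  18.,  12.])

Note that the matrix carries no column names or units; as described
above, keeping track of that metadata is left to the user.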
Much of the work in linear algebra centers on the emergent
properties of matrices, which are especially interesting when the
matrix has certain base attributes, and operations on these data
structures. Vectors, which can be seen as a special case of matrices,
are 1D arrays of numbers. This data structure can be used to
represent an individual data point or the weights in a linear
regression, for example. We cover properties of matrices and vectors
as well as operations on both.

Matrix Operations
Matrices can be added, subtracted, and multiplied—there is no
division of matrices, but there exists a similar concept called
inversion.
When indexing a matrix, we use a tuple, where the first index
represents the row number and the second index represents the
column number. To add two matrices A and B, one loops through
each index (i,j) of the two matrices, sums the two entries at the
current index, and places that result in the same index (i,j) of a new
matrix C, as can be seen in Figure 1-2.

Figure 1-2. Matrix addition
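As a rough sketch of the addition procedure just described (an
illustrative implementation, not the book's official code), we can
loop over every index (i, j) and confirm that PyTorch's built-in +
operator produces the same result:

import torch

A = torch.tensor([[1., 2.], [3., 4.]])
B = torch.tensor([[10., 20.], [30., 40.]])

# Addition is only defined for matrices of the same shape
assert A.shape == B.shape, "matrices of different shapes cannot be added"

# Element-wise addition by looping over every index (i, j)
C = torch.zeros_like(A)
for i in range(A.shape[0]):
    for j in range(A.shape[1]):
        C[i, j] = A[i, j] + B[i, j]

# The built-in operator performs the same element-wise sum
assert torch.equal(C, A + B)
print(C)   # every entry is the sum of the corresponding entries of A and B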

This algorithm implies that we can't add two matrices of different
shapes, since indices that exist in one matrix wouldn't exist in the
other. It also implies that the final matrix C is of the same shape as
A and B. In addition to adding matrices, we can multiply a matrix by
a scalar. This involves simply taking the scalar and multiplying each
of the entries of the matrix by it (the shape of the resultant matrix
stays constant), as depicted in Figure 1-3.

Figure 1-3. Scalar-matrix multiplication
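Scalar-matrix multiplication can be sketched the same way (again an
illustrative example under the same assumptions as above): every
entry is multiplied by the scalar, and the shape of the result stays
constant.

import torch

A = torch.tensor([[1., 2.], [3., 4.]])
k = 2.5   # an arbitrary scalar, chosen for illustration

# Multiply every entry of A by the scalar k
B = torch.zeros_like(A)
for i in range(A.shape[0]):
    for j in range(A.shape[1]):
        B[i, j] = k * A[i, j]

assert torch.equal(B, k * A)   # PyTorch broadcasts the scalar for us
assert B.shape == A.shape      # the shape of the result is unchanged
print(B)                       # each entry of A scaled by 2.5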

