Trying out pre-training of the general-purpose language model BERT [NLP][BERT]
This article reports on getting the pre-training of BERT, the language model that appeared and drew attention in the fall of 2018, up and running.
This time I walk through running it on the sample text in the google-research repository. Since I plan to pre-train on my own text later, I am recording the procedure here as groundwork.
Setting up the BERT execution environment
Set up the execution environment by referring to my earlier article below.
Running BERT pre-training
This time we use sample_text.txt in the bert folder and $BERT_BASE_DIR/vocab.txt.
sample_text.txt is raw text data: one sentence per line, with a single blank line inserted between documents.
sample_text.txt
This text is included to make sure Unicode is handled properly: 力加勝北区ᴵᴺᵀᵃছজটডণত
Text should be one-sentence-per-line, with empty lines between documents.
This sample text is public domain and was randomly selected from Project Guttenberg.

The rain had only ceased with the gray streaks of morning at Blazing Star, and the settlement awoke to a moral sense of cleanliness, and the finding of forgotten knives, tin cups, and smaller camp utensils, where the heavy showers had washed away the debris and dust heaps before the cabin doors.
Indeed, it was recorded in Blazing Star that a fortunate early riser had once picked up on the highway a solid chunk of gold quartz which the rain had freed from its incumbering soil, and washed into immediate and glittering popularity.
Possibly this may have been the reason why early risers in that locality, during the rainy season, adopted a thoughtful habit of body, and seldom lifted their eyes to the rifted or india-ink washed skies above them.
"Cass" Beard had risen early that morning, but not with a view to discovery.
A leak in his cabin roof,--quite consistent with his careless, improvident habits,--had roused him at 4 A. M., with a flooded "bunk" and wet blankets.
The chips from his wood pile refused to kindle a fire to dry his bed-clothes, and he had recourse to a more provident neighbor's to supply the deficiency.
This was nearly opposite.
Mr. Cassius crossed the highway, and stopped suddenly.
Something glittered in the nearest red pool before him.
Gold, surely!
But, wonderful to relate, not an irregular, shapeless fragment of crude ore, fresh from Nature's crucible, but a bit of jeweler's handicraft in the form of a plain gold ring.
Looking at it more attentively, he saw that it bore the inscription, "May to Cass."
Like most of his fellow gold-seekers, Cass was superstitious.

The fountain of classic wisdom, Hypatia herself.
As the ancient sage--the name is unimportant to a monk--pumped water nightly that he might study by day, so I, the guardian of cloaks and parasols, at the sacred doors of her lecture-room, imbibe celestial knowledge.
From my youth I felt in me a soul above the matter-entangled herd.
She revealed to me the glorious fact, that I am a spark of Divinity itself.
A fallen star, I am, sir!' continued he, pensively, stroking his lean stomach--'a fallen star!--fallen, if the dignity of philosophy will allow of the simile, among the hogs of the lower world--indeed, even into the hog-bucket itself.
Well, after all, I will show you the way to the Archbishop's.
There is a philosophic pleasure in opening one's treasures to the modest young.
Perhaps you will assist me by carrying this basket of fruit?'
And the little man jumped up, put his basket on Philammon's head, and trotted off up a neighbouring street.
Philammon followed, half contemptuous, half wondering at what this philosophy might be, which could feed the self-conceit of anything so abject as his ragged little apish guide; but the novel roar and whirl of the street, the perpetual stream of busy faces, the line of curricles, palanquins, laden asses, camels, elephants, which met and passed him, and squeezed him up steps and into doorways, as they threaded their way through the great Moon-gate into the ample street beyond, drove everything from his mind but wondering curiosity, and a vague, helpless dread of that great living wilderness, more terrible than any dead wilderness of sand which he had left behind.
Already he longed for the repose, the silence of the Laura--for faces which knew him and smiled upon him; but it was too late to turn back now.
His guide held on for more than a mile up the great main street, crossed in the centre of the city, at right angles, by one equally magnificent, at each end of which, miles away, appeared, dim and distant over the heads of the living stream of passengers, the yellow sand-hills of the desert; while at the end of the vista in front of them gleamed the blue harbour, through a network of countless masts.
At last they reached the quay at the opposite end of the street; and there burst on Philammon's astonished eyes a vast semicircle of blue sea, ringed with palaces and towers.
He stopped involuntarily; and his little guide stopped also, and looked askance at the young monk, to watch the effect which that grand panorama should produce on him.
The actual contents of sample_text.txt are shown above. The first line contains kanji and Bengali characters, so any character that Unicode can represent appears to be acceptable as input. TODO: investigate which character types are actually supported.
vocab.txt is simply a text file listing words and characters, one per line. Inspecting its contents, it includes word suffixes (subwords) such as "##nsor" and "##ing". The entry ##ing represents the suffix of words ending in "ing", and would be used for words such as running and laying. In addition, tokenization.py contains a commented-out example of BERT tokenization, reproduced below.
Example of tokens with a "##" prefix:
For example:
  input = "unaffable"
  output = ["un", "##aff", "##able"]
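As a quick sanity check, the WordpieceTokenizer in tokenization.py can be run directly on a single word. The following is a minimal sketch, assuming the google-research/bert repository is on the Python path and $BERT_BASE_DIR points at the downloaded BERT-Base model; the exact subword split depends on the vocabulary used.

import os
import tokenization  # from the google-research/bert repository

# Load vocab.txt into an OrderedDict mapping token -> id.
vocab = tokenization.load_vocab(os.path.join(os.environ["BERT_BASE_DIR"], "vocab.txt"))

# WordpieceTokenizer splits an already whitespace-tokenized word into subwords.
wordpiece = tokenization.WordpieceTokenizer(vocab=vocab)
print(wordpiece.tokenize("unaffable"))  # e.g. ["un", "##aff", "##able"], depending on the vocab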
The WordPiece library used to create vocab.txt exists only inside Google and has not been open-sourced, so to pre-train on other languages or other text you need to build your own tokenizer.
Incidentally, the tokenization code lives in tokenization.py. Reading the tokenize method of the FullTokenizer class there, you can confirm that BasicTokenizer first splits the text into tokens, and WordpieceTokenizer then splits each token into subwords. I looked into this preprocessing in more detail in a separate article.
Reference: https://techlife.cookpad.com/entry/2018/12/04/093000
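The same two-stage flow can be exercised through the public FullTokenizer API, which chains BasicTokenizer and WordpieceTokenizer internally. Below is a minimal sketch under the same assumptions as above (repository on the Python path, $BERT_BASE_DIR set); the input sentence is just an arbitrary example.

import os
import tokenization  # from the google-research/bert repository

vocab_file = os.path.join(os.environ["BERT_BASE_DIR"], "vocab.txt")

# FullTokenizer = BasicTokenizer (whitespace/punctuation/CJK handling) + WordpieceTokenizer (subwords).
tokenizer = tokenization.FullTokenizer(vocab_file=vocab_file, do_lower_case=True)

tokens = tokenizer.tokenize("The rain had only ceased with the gray streaks of morning.")
ids = tokenizer.convert_tokens_to_ids(tokens)
print(tokens)
print(ids)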
vocab.txt (excerpt)
wince wreath ##ticus hideout inspections sanjay disgrace infused pudding stalks ##urbed arsenic leases ##hyl ##rrard collarbone ##waite ##wil dowry ##bant ##edance genealogical nitrate salamanca scandals thyroid necessitated ##阿 ##陳 ##陽 ##雄 ##青 ##面 ##風 ##食 ##香 ##馬 ##高 ##龍 ##龸 ##fi ##fl ##! ##( ##) ##, ##- ##. ##/ ##: ##? ##~
Having checked the contents of each file, we can finally move on to running pre-training.
First, run create_pretraining_data.py to convert sample_text.txt into the masked ("fill in the blank") input data.
python create_pretraining_data.py --input_file=./sample_text.txt --output_file=/tmp/tf_examples.tfrecord --vocab_file=$BERT_BASE_DIR/vocab.txt --do_lower_case=True --max_seq_length=128 --max_predictions_per_seq=20 --masked_lm_prob=0.15 --random_seed=12345 --dupe_factor=5
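To see what create_pretraining_data.py actually wrote, the generated TFRecord can be inspected. This is a minimal sketch, assuming TensorFlow 1.x (the version targeted by the BERT repository) and the output path used above; the feature keys follow those written by create_pretraining_data.py.

import tensorflow as tf  # TensorFlow 1.x, as used by the BERT repository

record_path = "/tmp/tf_examples.tfrecord"

# Iterate over the serialized tf.train.Example protos and print the first one's features.
count = 0
for record in tf.python_io.tf_record_iterator(record_path):
    example = tf.train.Example.FromString(record)
    if count == 0:
        features = example.features.feature
        print("feature keys:", sorted(features.keys()))
        print("input_ids:", list(features["input_ids"].int64_list.value))
        print("masked_lm_positions:", list(features["masked_lm_positions"].int64_list.value))
    count += 1
print("number of instances:", count)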
Next, run run_pretraining.py to perform the actual pre-training.
python run_pretraining.py --input_file=/tmp/tf_examples.tfrecord --output_dir=/tmp/pretraining_output --do_train=True --do_eval=True --bert_config_file=$BERT_BASE_DIR/bert_config.json --init_checkpoint=$BERT_BASE_DIR/bert_model.ckpt --train_batch_size=16 --max_seq_length=128 --max_predictions_per_seq=20 --num_train_steps=20 --num_warmup_steps=10 --learning_rate=2e-5
When running run_pretraining.py, an out-of-memory error occurred, so I changed the --train_batch_size argument from 32 to 16.
If the following output appears after the run, it finished successfully.
***** Eval results *****
  global_step = 20
  loss = 0.0979674
  masked_lm_accuracy = 0.985479
  masked_lm_loss = 0.0979328
  next_sentence_accuracy = 1.0
  next_sentence_loss = 3.45724e-05
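The same summary should also be written to a text file in the output directory (eval_results.txt in the version of run_pretraining.py I used), which is handy once the console log has scrolled away. A minimal sketch assuming the output directory from the command above:

import os

output_dir = "/tmp/pretraining_output"

# run_pretraining.py writes the eval summary as "key = value" lines.
with open(os.path.join(output_dir, "eval_results.txt")) as f:
    for line in f:
        print(line.strip())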