
BERT Usage Guide

2023-10-07 · Author: 考证青年

1. Introduction

BERT's core procedure is quite simple. First, it draws two sentences from the dataset, where with 50% probability the second sentence is the actual successor of the first; this lets the model learn relationships between sentences. Second, it randomly masks out some of the words in the two sentences and asks the model to predict them, which lets the model learn relationships within a sentence. Finally, the processed sentences are fed into a large model, and training optimizes both objectives at once through two loss functions.
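The two objectives above (next-sentence prediction and masked-word prediction) can be sketched in a few lines of plain Python. This is a simplified illustration, not the repo's code: the real create_pretraining_data.py sometimes replaces a chosen token with a random word or leaves it unchanged instead of always writing [MASK].

```python
import random

# Toy sketch of BERT's two pretraining objectives (simplified illustration).
def make_instance(sentences, i, mask_prob=0.15, rng=None):
    rng = rng or random.Random(0)
    sent_a = sentences[i]
    # NSP: 50% of the time pair with the true next sentence (label 0),
    # otherwise with a random sentence from the corpus (label 1).
    if rng.random() < 0.5 and i + 1 < len(sentences):
        sent_b, nsp_label = sentences[i + 1], 0
    else:
        sent_b, nsp_label = rng.choice(sentences), 1
    tokens = ["[CLS]"] + list(sent_a) + ["[SEP]"] + list(sent_b) + ["[SEP]"]
    # MLM: mask ~15% of the non-special tokens; the model must predict them.
    masked_positions = []
    for pos, tok in enumerate(tokens):
        if tok not in ("[CLS]", "[SEP]") and rng.random() < mask_prob:
            tokens[pos] = "[MASK]"
            masked_positions.append(pos)
    return tokens, masked_positions, nsp_label
```

In the log examples later in this article, next_sentence_labels is exactly this NSP label (0 = real next sentence, 1 = random), and the [MASK] tokens correspond to the masked positions.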

2 Running (CPU only)

2.1 Environment

1. Officially, a version of .10 or above is recommended; TF 1.12 can be set up.

2. Download the whole project from gitee, and add the -12_H-768_A-12 folder and the tmp folder under the repository's Data directory into the project.

With that in place, pretraining can begin.

2.2 Setting your own file paths

In args.py, replace the quoted placeholder paths ("" and '') with the path to your own tmp folder.
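The repo's args.py is not reproduced in this article, so as an assumption it contains something like the following; the variable and file names below are hypothetical, only the idea of pointing the quoted paths at your tmp folder comes from the text:

```python
import os

# args.py (hypothetical sketch; the real file may name these differently).
# The empty-string placeholders should point at your own tmp folder.
tmp_dir = os.path.join(".", "tmp")                   # was "" / '' in the repo
input_file = os.path.join(tmp_dir, "corpus.txt")     # hypothetical filename
output_file = os.path.join(tmp_dir, "out.tfrecord")  # hypothetical filename
```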

2.3 Running

python create_pretraining_data.py

This produces output similar to the following:

INFO:tensorflow:*** Example ***
INFO:tensorflow:tokens: [CLS] indeed , it was recorded in [MASK] star that a fortunate early [MASK] ##r had once picked up on the highway a solid chunk [MASK] gold quartz which the [MASK] had freed from its inc [MASK] ##ing soil , and washed into immediate and [MASK] popularity . [SEP] rainy season , [MASK] insult show habit of body , and seldom lifted their eyes to the rift ##ed [MASK] india - ink washed skies [MASK] them . " cass " beard [MASK] elliot early that morning , but not with a view to [MASK] . a leak in his [MASK] roof , - - quite [MASK] with his careless , imp ##rov ##ide ##nt habits , - - had rouse ##d him at 4 a [MASK] m [SEP]
INFO:tensorflow:input_ids: 101 5262 1010 2009 2001 2680 1999 103 2732 2008 1037 19590 2220 103 2099 2018 2320 3856 2039 2006 1996 3307 1037 5024 20000 103 2751 20971 2029 1996 103 2018 10650 2013 2049 4297 103 2075 5800 1010 1998 8871 2046 6234 1998 103 6217 1012 102 16373 2161 1010 103 15301 2265 10427 1997 2303 1010 1998 15839 4196 2037 2159 2000 1996 16931 2098 103 2634 1011 10710 8871 15717 103 2068 1012 1000 16220 1000 10154 103 11759 2220 2008 2851 1010 2021 2025 2007 1037 3193 2000 103 1012 1037 17271 1999 2010 103 4412 1010 1011 1011 3243 103 2007 2010 23358 1010 17727 12298 5178 3372 14243 1010 1011 1011 2018 27384 2094 2032 2012 1018 1037 103 1049 102
INFO:tensorflow:input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
INFO:tensorflow:segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
INFO:tensorflow:masked_lm_positions: 7 12 13 25 30 36 45 52 53 54 68 74 81 82 93 99 103 105 125 0
INFO:tensorflow:masked_lm_ids: 17162 2220 4125 1997 4542 29440 20332 4233 1037 16465 2030 2682 2018 13763 5456 6644 1011 8335 1012 0
INFO:tensorflow:masked_lm_weights: 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 0.0
INFO:tensorflow:next_sentence_labels: 0
INFO:tensorflow:*** Example ***
INFO:tensorflow:tokens: [CLS] and there burst on phil ##am ##mon ' s astonished eyes a vast semi ##ci ##rcle of blue sea [MASK] ring ##ed with palaces and towers [MASK] [SEP] like most of [MASK] fellow gold - seekers , cass was super ##sti [MASK] . [SEP]
INFO:tensorflow:input_ids: 101 1998 2045 6532 2006 6316 3286 8202 1005 1055 22741 2159 1037 6565 4100 6895 21769 1997 2630 2712 103 3614 2098 2007 22763 1998 7626 103 102 2066 2087 1997 103 3507 2751 1011 24071 1010 16220 2001 3565 16643 103 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
INFO:tensorflow:input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
INFO:tensorflow:segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
INFO:tensorflow:masked_lm_positions: 10 20 23 27 32 39 42 0 0 0 0 0 0 0 0 0 0 0 0 0
INFO:tensorflow:masked_lm_ids: 22741 1010 2007 1012 2010 2001 20771 0 0 0 0 0 0 0 0 0 0 0 0 0
INFO:tensorflow:masked_lm_weights: 1.0 1.0 1.0 1.0 1.0 1.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
INFO:tensorflow:next_sentence_labels: 1
INFO:tensorflow:Wrote 60 total instances
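The masked_lm_* lines above show how each instance is padded out to max_predictions_per_seq (20 here): the second example has only 7 real masks, so the remaining 13 slots get id 0, position 0, and weight 0.0, and the weights tell the loss to ignore them. A minimal sketch of that padding step (an illustration, not the repo's exact code):

```python
# Pad the masked-LM fields of one instance to max_predictions_per_seq,
# matching the masked_lm_positions/ids/weights lines in the log above.
def pad_masked_lm(positions, ids, max_predictions_per_seq=20):
    positions, ids = list(positions), list(ids)
    weights = [1.0] * len(ids)           # real predictions get weight 1.0
    while len(positions) < max_predictions_per_seq:
        positions.append(0)              # padding slots point at position 0 ...
        ids.append(0)
        weights.append(0.0)              # ... and are ignored via weight 0.0
    return positions, ids, weights

# Second example above: 7 real masks padded out to 20 slots.
pos, ids, w = pad_masked_lm([10, 20, 23, 27, 32, 39, 42],
                            [22741, 1010, 2007, 1012, 2010, 2001, 20771])
```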

This yields the pretraining data. Next, run the pretraining script:

python run_pretraining.py


The run ends by printing the loss and related information:

Instructions for updating:
Use `tf.data.experimental.map_and_batch(...)`.
INFO:tensorflow:Calling model_fn.
INFO:tensorflow:Running train on CPU
INFO:tensorflow:*** Features ***
INFO:tensorflow:  name = input_ids, shape = (32, 128)
INFO:tensorflow:  name = input_mask, shape = (32, 128)
INFO:tensorflow:  name = masked_lm_ids, shape = (32, 20)
INFO:tensorflow:  name = masked_lm_positions, shape = (32, 20)
INFO:tensorflow:  name = masked_lm_weights, shape = (32, 20)
INFO:tensorflow:  name = next_sentence_labels, shape = (32, 1)
INFO:tensorflow:  name = segment_ids, shape = (32, 128)
INFO:tensorflow:Done calling model_fn.
INFO:tensorflow:Create CheckpointSaverHook.
INFO:tensorflow:Graph was finalized.
INFO:tensorflow:Restoring parameters from /home/tongji/Bert_word2vec/pretraining_output/model.ckpt-2000
INFO:tensorflow:Running local_init_op.
INFO:tensorflow:Done running local_init_op

2.4 Results

Training ran for 20,000 steps in total, and next-sentence-prediction accuracy reached 0.99.

Since this run did not use a GPU, the total runtime came to more than four days.

3 Running with a GPU

3.1 Environment

Use the GPU build, tensorflow-gpu. Note that its version number must match your CUDA version! Otherwise pip will fail with a "no matching version" error.

Check your CUDA version:

cat /usr/local/cuda/version.txt

Check the g++ version:

g++ --version


Check the installed TensorFlow packages:

pip list | grep tensorflow

When installing the matching tensorflow-gpu: the CUDA on the server is 10.0, so install version 1.13.1:

pip --default-timeout=1000 install -U tensorflow-gpu==1.13.1  -i https://pypi.tuna.tsinghua.edu.cn/simple/
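The version match can be expressed as a small lookup table. The entries below are an assumption transcribed from memory of TensorFlow's tested-configurations page; verify them against the official documentation before relying on them:

```python
# Partial tensorflow-gpu -> required CUDA toolkit table (TF 1.x era,
# from memory of TensorFlow's tested-configurations page).
TF_GPU_CUDA = {
    "1.12": "9.0",
    "1.13": "10.0",
    "1.14": "10.0",
    "1.15": "10.0",
}

def required_cuda(tf_version):
    """Return the CUDA version a tensorflow-gpu release was built against."""
    major_minor = ".".join(tf_version.split(".")[:2])
    return TF_GPU_CUDA.get(major_minor)

# CUDA 10.0 on the server -> tensorflow-gpu 1.13.1 is a valid choice.
```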

3.2 Running

python create_pretraining_data.py

The output is as follows:

INFO:tensorflow:*** start time ***
INFO:tensorflow:  1603274172.5969179
INFO:tensorflow:*** Reading from input files ***
INFO:tensorflow:  /mnt/home/tongji/car-corpus-lmodels/Models/Bert_word2vec/tmp/jieba_cut_output.txt.530
INFO:tensorflow:*** read time ***
INFO:tensorflow:  39.821722984313965
INFO:tensorflow:*** Writing to output files ***
INFO:tensorflow:  /mnt/home/tongji/car-corpus-lmodels/Models/Bert_word2vec/tmp/530.tfrecord
INFO:tensorflow:*** Example ***
INFO:tensorflow:tokens: [CLS] [MASK] 用 的 是 [MASK] 功 率 版 2 . 0 [MASK] ##si 发 动 机 ##ner 最 [MASK] [MASK] 出 功 [MASK] 165 千 瓦 痼 224 马 力 ) , 峰 值 [MASK] 矩 350 [MASK] [MASK] 米 。 因 为 搭 [MASK] 了 奥 迪 b c ##y ##cle 循 [MASK] 技 术 , 提 升 了 燃 油 经 [SEP] 胶 气 动 装 置 , 无 论 是 设 计 手 法 和 作 用 效 果 , 在 如 [MASK] 看 来 都 不 十 分 到 位 [MASK] 如 今 air li ##ft 的 气 动 避 震 产 品 广 泛 [MASK] 于 [MASK] 种 车 [MASK] 中 。 大 [MASK] 速 腾 , 除 了 先 天 [SEP]
INFO:tensorflow:input_ids: 101 103 4500 4638 3221 103 1216 4372 4276 123 119 121 103 9182 1355 1220 3322 8957 3297 103 103 1139 1216 103 9316 1283 4482 4593 10629 7716 1213 8021 8024 2292 966 103 4762 8612 103 103 5101 511 1728 711 3022 103 749 1952 6832 144 145 8179 11619 2542 103 2825 3318 8024 2990 1285 749 4234 3779 5307 102 5540 3698 1220 6163 5390 8024 3187 6389 3221 6392 6369 2797 3791 1469 868 4500 3126 3362 8024 1762 1963 103 4692 3341 6963 679 1282 1146 1168 855 103 1963 791 8523 9341 9002 4638 3698 1220 6912 7448 772 1501 2408 3793 103 754 103 4905 6756 103 704 511 1920 103 6862 5596 8024 7370 749 1044 1921 102
INFO:tensorflow:input_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
INFO:tensorflow:segment_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
INFO:tensorflow:masked_lm_positions: 1 5 12 17 19 20 23 27 35 38 39 45 54 86 95 110 112 115 119 0
INFO:tensorflow:masked_lm_ids: 7023 7770 12719 8024 1920 6783 4372 8020 2814 4281 185 6770 4384 791 8024 4500 1392 1798 830 0
INFO:tensorflow:masked_lm_weights: 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 0.0
INFO:tensorflow:next_sentence_labels: 1
INFO:tensorflow:*** write time ***
INFO:tensorflow:  2.3067338466644287
INFO:tensorflow:*** all time ***
INFO:tensorflow:  42.128456830978394

Then run the pretraining script:

python run_pretraining.py

3.3 Results

INFO:tensorflow:Done running local_init_op.
INFO:tensorflow:Evaluation [10/100]
INFO:tensorflow:Evaluation [20/100]
INFO:tensorflow:Evaluation [30/100]
INFO:tensorflow:Evaluation [40/100]
INFO:tensorflow:Evaluation [50/100]
INFO:tensorflow:Evaluation [60/100]
INFO:tensorflow:Evaluation [70/100]
INFO:tensorflow:Evaluation [80/100]
INFO:tensorflow:Evaluation [90/100]
INFO:tensorflow:Evaluation [100/100]
INFO:tensorflow:Finished evaluation at 2020-10-21-11:44:22
INFO:tensorflow:Saving dict for global step 20000: global_step = 20000, loss = 4.2188535e-06, masked_lm_accuracy = 1.0, masked_lm_loss = 4.20574e-06, next_sentence_accuracy = 1.0, next_sentence_loss = 1.31130085e-08
INFO:tensorflow:Saving 'checkpoint_path' summary for global step 20000: /mnt/home/tongji/car-corpus-lmodels/Models/Bert_word2vec/pretraining_output/model.ckpt-20000
INFO:tensorflow:evaluation_loop marked as finished
INFO:tensorflow:***** Eval results *****
INFO:tensorflow:  global_step = 20000
INFO:tensorflow:  loss = 4.2188535e-06
INFO:tensorflow:  masked_lm_accuracy = 1.0
INFO:tensorflow:  masked_lm_loss = 4.20574e-06
INFO:tensorflow:  next_sentence_accuracy = 1.0
INFO:tensorflow:  next_sentence_loss = 1.31130085e-08
INFO:tensorflow:*** all time ***
INFO:tensorflow:Wed Oct 21 19:44:23 2020
INFO:tensorflow:秒:6214.46689581871

Compared with the CPU run, the runtime (about 6,214 seconds, i.e. under two hours) is far shorter.
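As a rough sanity check of that speedup, taking "more than four days" on CPU at face value as a lower bound:

```python
gpu_seconds = 6214.46689581871   # from the log above
cpu_seconds = 4 * 24 * 3600      # "more than four days" -> at least 345,600 s

speedup = cpu_seconds / gpu_seconds  # how many times faster the GPU run is
hours = gpu_seconds / 3600           # GPU runtime in hours
# the GPU run finishes in under two hours, a >50x speedup over the CPU run
```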

4 Summary

This article is a beginner's manual for running the project code. The goal is to pretrain BERT on the input corpus so that later recognition tasks are more accurate. As the numbers show, GPU training takes far less time than CPU training.
