TigerResearch / tigerbot-7b-base

Tags: Text Generation · Transformers · PyTorch · llama · text-generation-inference
License: apache-2.0
Branch: refs/pr/2 · total size 27.9 GB · 1 contributor · History: 8 commits
Latest commit: SFconvertbot — "Adding `safetensors` variant of this model" (c0d4b0a, verified, over 1 year ago)
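The repository is tagged for text generation with Transformers and PyTorch (llama architecture), so the checkpoint can be pulled by its repo id. Below is a minimal, hedged sketch of loading it with the Hugging Face transformers library; the dtype and device choices are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch: load tigerbot-7b-base with transformers.
# Assumes transformers, torch, and accelerate are installed; dtype/device are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TigerResearch/tigerbot-7b-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the checkpoint shards contain bfloat16 storages
    device_map="auto",           # requires accelerate; places layers automatically
)

prompt = "TigerBot is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```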
| File | Scan | Size | Last commit message | Updated |
|---|---|---|---|---|
| .gitattributes | Safe | 1.52 kB | initial commit | over 2 years ago |
| README.md | Safe | 2.24 kB | Update README.md | over 2 years ago |
| added_tokens.json | Safe | 88 Bytes | update tigerbot-7b-base-v3-tokenizer | over 2 years ago |
| config.json | Safe | 640 Bytes | upload tigerbot-7b-base-v3 | over 2 years ago |
| generation_config.json | Safe | 132 Bytes | upload tigerbot-7b-base-v3 | over 2 years ago |
| model-00001-of-00002.safetensors | — | 9.94 GB | Adding `safetensors` variant of this model | over 1 year ago |
| model-00002-of-00002.safetensors | — | 4.01 GB | Adding `safetensors` variant of this model | over 1 year ago |
| model.safetensors.index.json | — | 28.1 kB | Adding `safetensors` variant of this model | over 1 year ago |
| pytorch_model-00001-of-00002.bin | pickle | 9.94 GB | upload tigerbot-7b-base-v3 | over 2 years ago |
| pytorch_model-00002-of-00002.bin | pickle | 4.01 GB | upload tigerbot-7b-base-v3 | over 2 years ago |
| pytorch_model.bin.index.json | Safe | 26.8 kB | upload tigerbot-7b-base-v3 | over 2 years ago |
| special_tokens_map.json | Safe | 529 Bytes | update tigerbot-7b-base-v3-tokenizer | over 2 years ago |
| tokenizer.json | Safe | 2.99 MB | update tigerbot-7b-base-v3-tokenizer | over 2 years ago |
| tokenizer.model | Safe | 941 kB | upload tigerbot-7b-base-v3-tokenizer | over 2 years ago |
| tokenizer_config.json | Safe | 766 Bytes | Update tokenizer_config.json | over 2 years ago |

Both pytorch_model-*.bin shards are pickle-serialized; the pickle-import scan detected four imports in each: torch.BFloat16Storage, torch._utils._rebuild_tensor_v2, torch.FloatStorage, and collections.OrderedDict.
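Because the repository carries both the pickle-based .bin shards and a safetensors variant (added on the refs/pr/2 branch shown above), a cautious loader can ask transformers for the safetensors weights explicitly. The sketch below assumes a transformers release recent enough to support the use_safetensors argument; it is an illustration, not part of the model card.

```python
# Sketch: prefer the safetensors shards over the pickle .bin shards when loading.
# use_safetensors=True makes from_pretrained fail rather than fall back to pickle files.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TigerResearch/tigerbot-7b-base",
    revision="refs/pr/2",   # branch carrying the safetensors variant (see file list above)
    use_safetensors=True,   # load model-*.safetensors instead of pytorch_model-*.bin
)
```

Loading the safetensors variant avoids unpickling the torch storages listed in the pickle-import scan above.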