Commit 28c19ab

Make it easier to develop without a dev install (huggingface#22697)

* Make it easier to develop without a dev install
* Remove ugly hack that doesn't work anyway

1 parent 4c01231 commit 28c19ab

File tree: 6 files changed (+62, -9 lines)

CONTRIBUTING.md (+7, -5)

@@ -162,14 +162,16 @@ You'll need **[Python 3.7]((https://github.com/huggingface/transformers/blob/mai
    it with `pip uninstall transformers` before reinstalling it in editable
    mode with the `-e` flag.
 
-   Depending on your OS, you may need to install some external libraries as well if the `pip` installation fails.
-
-   For macOS, you will likely need [MeCab](https://taku910.github.io/mecab/) which can be installed from Homebrew:
-
+   Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
+   failure with this command. If that's the case make sure to install the Deep Learning framework you are working with
+   (PyTorch, TensorFlow and/or Flax) then do:
+
    ```bash
-   brew install mecab
+   pip install -e ".[quality]"
    ```
 
+   which should be enough for most use cases.
+
 5. Develop the features on your branch.
 
    As you work on your code, you should make sure the test suite
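The fallback this diff recommends amounts to: install your Deep Learning framework first, then the lighter `quality` extra. As a small illustration (not part of the commit), here is a hedged Python sketch of checking which of the three frameworks is importable before deciding which extra to install; `available_frameworks` is a hypothetical helper name, not a Transformers API.

```python
import importlib.util

def available_frameworks():
    """Return the subset of PyTorch/TensorFlow/Flax that is importable.

    Uses importlib.util.find_spec, which looks a module up on sys.path
    without actually importing it (so it stays cheap even for heavy
    frameworks).
    """
    candidates = {"PyTorch": "torch", "TensorFlow": "tensorflow", "Flax": "flax"}
    return sorted(
        name
        for name, module in candidates.items()
        if importlib.util.find_spec(module) is not None
    )

print(available_frameworks())
```

Depending on what is installed in the current environment, this prints some subset of `['Flax', 'PyTorch', 'TensorFlow']`.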

docs/source/en/add_new_model.mdx (+9, -1)

@@ -202,7 +202,15 @@ source .env/bin/activate
 pip install -e ".[dev]"
 ```
 
-and return to the parent directory
+Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
+failure with this command. If that's the case make sure to install the Deep Learning framework you are working with
+(PyTorch, TensorFlow and/or Flax) then do:
+
+```bash
+pip install -e ".[quality]"
+```
+
+which should be enough for most use cases. You can then return to the parent directory
 
 ```bash
 cd ..

docs/source/en/add_tensorflow_model.mdx (+7)

@@ -119,6 +119,13 @@ source .env/bin/activate
 pip install -e ".[dev]"
 ```
 
+Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
+failure with this command. If that's the case make sure to install TensorFlow then do:
+
+```bash
+pip install -e ".[quality]"
+```
+
 **Note:** You don't need to have CUDA installed. Making the new model work on CPU is sufficient.
 
 4. Create a branch with a descriptive name from your main branch

docs/source/en/pr_checks.mdx (+13, -2)

@@ -24,7 +24,7 @@ When you open a pull request on 🤗 Transformers, a fair number of checks will
 
 In this document, we will take a stab at explaining what those various checks are and the reason behind them, as well as how to debug them locally if one of them fails on your PR.
 
-Note that they all require you to have a dev install:
+Note that, ideally, they require you to have a dev install:
 
 ```bash
 pip install transformers[dev]
@@ -36,7 +36,18 @@ or for an editable install:
 pip install -e .[dev]
 ```
 
-inside the Transformers repo.
+inside the Transformers repo. Since the number of optional dependencies of Transformers has grown a lot, it's possible you don't manage to get all of them. If the dev install fails, make sure to install the Deep Learning framework you are working with (PyTorch, TensorFlow and/or Flax) then do
+
+```bash
+pip install transformers[quality]
+```
+
+or for an editable install:
+
+```bash
+pip install -e .[quality]
+```
+
 
 ## Tests
 

templates/adding_a_new_model/README.md (+16)

@@ -34,6 +34,14 @@ cd transformers
 pip install -e ".[dev]"
 ```
 
+Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
+failure with this command. If that's the case make sure to install the Deep Learning framework you are working with
+(PyTorch, TensorFlow and/or Flax) then do:
+
+```bash
+pip install -e ".[quality]"
+```
+
 Once the installation is done, you can use the CLI command `add-new-model` to generate your models:
 
 ```shell script
@@ -133,6 +141,14 @@ cd transformers
 pip install -e ".[dev]"
 ```
 
+Depending on your OS, and since the number of optional dependencies of Transformers is growing, you might get a
+failure with this command. If that's the case make sure to install the Deep Learning framework you are working with
+(PyTorch, TensorFlow and/or Flax) then do:
+
+```bash
+pip install -e ".[quality]"
+```
+
 Once the installation is done, you can use the CLI command `add-new-model-like` to generate your models:
 
 ```shell script

utils/check_inits.py (+10, -1)

@@ -277,11 +277,20 @@ def check_submodules():
 
     transformers = direct_transformers_import(PATH_TO_TRANSFORMERS)
 
+    import_structure_keys = set(transformers._import_structure.keys())
+    # This contains all the base keys of the _import_structure object defined in the init, but if the user is missing
+    # some optional dependencies, they may not have all of them. Thus we read the init to collect all additions and
+    # (potentially re-) add them.
+    with open(os.path.join(PATH_TO_TRANSFORMERS, "__init__.py"), "r") as f:
+        init_content = f.read()
+    import_structure_keys.update(set(re.findall(r"import_structure\[\"([^\"]*)\"\]", init_content)))
+
     module_not_registered = [
         module
         for module in get_transformers_submodules()
-        if module not in IGNORE_SUBMODULES and module not in transformers._import_structure.keys()
+        if module not in IGNORE_SUBMODULES and module not in import_structure_keys
     ]
+
     if len(module_not_registered) > 0:
         list_of_modules = "\n".join(f"- {module}" for module in module_not_registered)
         raise ValueError(
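The key-extraction logic added to `check_submodules` can be sketched in isolation: instead of relying only on the keys of the live `_import_structure` dict (which shrinks when optional dependencies are missing), the check also scans the raw `__init__.py` source for `import_structure["..."]` subscript assignments. A minimal, self-contained reproduction — the `sample_init` content below is invented for illustration, not taken from the real init:

```python
import re

# Sample of the patterns the regex targets: dict-literal keys are NOT
# matched, only later subscript additions like _import_structure["..."].
sample_init = '''
_import_structure = {"models.bert": ["BertConfig"]}
if is_torch_available():
    _import_structure["models.bert"].append("BertModel")
    _import_structure["models.gpt2"] = ["GPT2Model"]
'''

# Same regex as the commit (written with single quotes outside, so the
# inner double quotes need no escaping). It also matches the substring
# inside `_import_structure[...]`, which is exactly what the check wants.
keys = set(re.findall(r'import_structure\["([^"]*)"\]', sample_init))
print(sorted(keys))  # ['models.bert', 'models.gpt2']
```

Because the source is scanned textually, keys guarded by `is_torch_available()`-style conditions are still counted even in an environment where those frameworks are absent — which is what lets the check pass without a full dev install.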
