How to install correctly on windows 11? #457

Open
nexon33 opened this issue Jan 18, 2023 · 1 comment

nexon33 commented Jan 18, 2023

Please make sure that this is a Bug or a Feature Request and provide all applicable information asked by the template.
If your issue is an implementation question, please ask your question on StackOverflow or on the TensorTrade Discord #help-desk channel instead of opening a GitHub issue.

System information

  • Installed via the default pip install tensortrade, and also from the GitHub source
  • OS: Windows 11, using conda
  • TensorTrade version: 1.0.3 (also tried the latest GitHub version, 1.0.4)
  • TensorFlow version: 2.11.0
  • Python version: 3.8.15

The example notebooks (use_lstm_rllib.ipynb and use_attentionnet_rllib.ipynb) error out on

analysis = tune.run(
    "PPO",

with the error

 from ray.rllib.env.utils import record_env_wrapper
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\rllib\env\utils.py", line 57, in <module>
    class VideoMonitor(wrappers.Monitor):
AttributeError: module 'gym.wrappers' has no attribute 'Monitor'
2023-01-18 13:13:07,081	INFO trial_runner.py:360 -- Restarting experiment.

I tried to fix this by downgrading gym to version 0.22.0, but then I get another error:

2023-01-18 13:16:46,417	ERROR ray_trial_executor.py:621 -- Trial PPO_TradingEnv_fa1e7fdf: Unexpected error starting runner.
Traceback (most recent call last):
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\tune\ray_trial_executor.py", line 612, in start_trial
    return self._start_trial(trial, checkpoint, train=train)
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\tune\ray_trial_executor.py", line 490, in _start_trial
    runner = self._setup_remote_runner(trial)
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\tune\ray_trial_executor.py", line 288, in _setup_remote_runner
    trial.init_logdir()
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\tune\trial.py", line 495, in init_logdir
    self.logdir = create_logdir(self._generate_dirname(),
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\site-packages\ray\tune\trial.py", line 163, in create_logdir
    os.makedirs(logdir, exist_ok=True)
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  [Previous line repeated 2 more times]
  File "c:\Users\user\anaconda3\envs\tenstrade\lib\os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [WinError 123] The filename, directory name, or volume label syntax is incorrect: 'C:\\Users\\user\\ray_results\\PPO\\PPO_TradingEnv_fa1e7fdf_2_clip_rewards=True,entropy_coeff=0.0061681,csv_filename=c:'
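The OSError happens because Ray Tune embeds config values (here a path starting with "c:") in the trial directory name, and Windows rejects ':' in folder names. A possible workaround, hedged sketch only, is to pass tune.run a custom trial_dirname_creator that keeps config values out of the path; the helper name below is made up for illustration:

```python
import re

def safe_trial_dirname(trial):
    # trial.trainable_name and trial.trial_id are plain identifiers;
    # combining them keeps config values (and characters like ':') out of the path.
    raw = f"{trial.trainable_name}_{trial.trial_id}"
    # Replace any characters Windows forbids in file names with underscores.
    return re.sub(r'[<>:"/\\|?*]', "_", raw)

# analysis = tune.run("PPO", ..., trial_dirname_creator=safe_trial_dirname)
```

This avoids the invalid-path crash without touching the experiment config itself.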

Expected behavior: the example notebooks should run without error.

Code to reproduce the issue

conda create -n tensortrade python=3.8
conda activate tensortrade
pip install tensortrade

I also tried

git clone https://github.com/tensortrade-org/tensortrade.git
cd tensortrade
pip install -r requirements.txt
pip install -r examples/requirements.txt
pip install -e "."

I just want to get TensorTrade working on Windows with Anaconda.


nexon33 commented Jan 18, 2023

I have been debugging and managed to fix some of the errors; I'm currently on Python 3.9.15.

I got down to the error "ValueError: Could not get observation and action spaces from remote worker. Maybe specify them manually in the config?" when running use_attentionnet_rllib.ipynb:

(pid=6356) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow\python\framework\dtypes.py:246: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=6356)   np.bool8: (False, True),
(pid=6356) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\tune\logger\tensorboardx.py:35: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=6356)   VALID_NP_HPARAMS = (np.bool8, np.float32, np.float64, np.int32, np.int64)
(pid=6356) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\skimage\util\dtype.py:27: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=6356)   np.bool8: (False, True),
(pid=6356) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow_probability\python\__init__.py:57: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
(pid=6356)   if (distutils.version.LooseVersion(tf.__version__) <
(PPO pid=6356) 2023-01-18 16:57:31,287	WARNING algorithm_config.py:488 -- Cannot create PPOConfig from given `config_dict`! Property __stdout_file__ not supported.
(PPO pid=6356) 2023-01-18 16:57:32,025	INFO algorithm.py:501 -- Current log_level is ERROR. For more information, set 'log_level': 'INFO' / 'DEBUG' or use the -v and -vv flags.
(pid=23056) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow\python\framework\dtypes.py:246: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=23056)   np.bool8: (False, True),
(pid=26092) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow\python\framework\dtypes.py:246: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=26092)   np.bool8: (False, True),
(pid=23056) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\tune\logger\tensorboardx.py:35: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=23056)   VALID_NP_HPARAMS = (np.bool8, np.float32, np.float64, np.int32, np.int64)
(pid=23056) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\skimage\util\dtype.py:27: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=23056)   np.bool8: (False, True),
(pid=26092) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\tune\logger\tensorboardx.py:35: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=26092)   VALID_NP_HPARAMS = (np.bool8, np.float32, np.float64, np.int32, np.int64)
(pid=26092) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\skimage\util\dtype.py:27: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`.  (Deprecated NumPy 1.24)
(pid=26092)   np.bool8: (False, True),
(pid=23056) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow_probability\python\__init__.py:57: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
(pid=23056)   if (distutils.version.LooseVersion(tf.__version__) <
(pid=26092) c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\tensorflow_probability\python\__init__.py:57: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
(pid=26092)   if (distutils.version.LooseVersion(tf.__version__) <
2023-01-18 16:58:10,428	ERROR trial_runner.py:1088 -- Trial PPO_TradingEnv_4c94f63a: Error processing event.
ray.tune.error._TuneNoNextExecutorEventError: Traceback (most recent call last):
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\tune\execution\ray_trial_executor.py", line 1070, in get_next_executor_event
    future_result = ray.get(ready_future)
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\_private\client_mode_hook.py", line 105, in wrapper
    return func(*args, **kwargs)
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\_private\worker.py", line 2311, in get
    raise value
ray.exceptions.RayActorError: The actor died because of an error raised in its creation task, ray::PPO.__init__() (pid=6356, ip=127.0.0.1, repr=PPO)
  File "python\ray\_raylet.pyx", line 830, in ray._raylet.execute_task
  File "python\ray\_raylet.pyx", line 834, in ray._raylet.execute_task
  File "python\ray\_raylet.pyx", line 780, in ray._raylet.execute_task.function_executor
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\_private\function_manager.py", line 674, in actor_method_executor
    return method(__ray_actor, *args, **kwargs)
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\util\tracing\tracing_helper.py", line 466, in _resume_span
    return method(self, *_args, **_kwargs)
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\rllib\algorithms\algorithm.py", line 441, in __init__
    super().__init__(
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\tune\trainable\trainable.py", line 169, in __init__
    self.setup(copy.deepcopy(self.config))
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\util\tracing\tracing_helper.py", line 466, in _resume_span
    return method(self, *_args, **_kwargs)
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\rllib\algorithms\algorithm.py", line 566, in setup
    self.workers = WorkerSet(
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\rllib\evaluation\worker_set.py", line 169, in __init__
    self._setup(
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\rllib\evaluation\worker_set.py", line 253, in _setup
    spaces = self._get_spaces_from_remote_worker()
  File "c:\Users\user\anaconda3\envs\tensortrade\lib\site-packages\ray\rllib\evaluation\worker_set.py", line 287, in _get_spaces_from_remote_worker
    raise ValueError(
ValueError: Could not get observation and action spaces from remote worker. Maybe specify them manually in the config?
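As the error message suggests, one workaround may be to specify the spaces explicitly in the RLlib config instead of letting it probe a remote worker. A hedged sketch, where the space shapes are placeholders and not the notebook's actual TradingEnv spaces:

```python
import numpy as np
import gym

# Placeholder spaces: a 10-feature observation vector and 3 discrete actions.
observation_space = gym.spaces.Box(
    low=-np.inf, high=np.inf, shape=(10,), dtype=np.float32
)
action_space = gym.spaces.Discrete(3)

config = {
    "env": "TradingEnv",
    # With these keys set, RLlib should not need to query a remote worker
    # for the spaces.
    "observation_space": observation_space,
    "action_space": action_space,
}
# analysis = tune.run("PPO", config=config, ...)
```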

@nexon33 nexon33 closed this as completed Jan 18, 2023
@nexon33 nexon33 reopened this Jan 18, 2023