deluca
Performant, differentiable reinforcement learning
Notes
- This is pre-alpha software and is undergoing a number of core changes. Updates to follow.
- Please see the examples for guidance on how to use deluca.
Hi.
I am trying to install deluca and I get an Exception error. I am using Ubuntu 64-bit on a virtual machine, PyCharm CE 2021.2, Python 3.8, pip 212.1.2.
I tried to install deluca with the package manager in PyCharm, the terminal in PyCharm, and also the Ubuntu terminal; the error is the same in each case. Note that I can install other common packages like NumPy and SciPy with no problem. Thanks in advance, and I am looking forward to using this amazing package!
pip install deluca
Collecting deluca
Using cached deluca-0.0.17-py3-none-any.whl (52 kB)
Collecting flax
Using cached flax-0.3.4-py3-none-any.whl (183 kB)
Collecting brax
Using cached brax-0.0.4-py3-none-any.whl (117 kB)
Processing ./.cache/pip/wheels/78/ae/07/bd3adac873fa80efc909c09331831905ac657dbb8d1278235e/jax-0.2.19-py3-none-any.whl
Collecting optax
Using cached optax-0.0.9-py3-none-any.whl (118 kB)
Collecting scipy
Using cached scipy-1.7.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl (28.4 MB)
Collecting numpy
Using cached numpy-1.21.2-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.8 MB)
Collecting matplotlib
Using cached matplotlib-3.4.3-cp38-cp38-manylinux1_x86_64.whl (10.3 MB)
Collecting msgpack
Using cached msgpack-1.0.2-cp38-cp38-manylinux1_x86_64.whl (302 kB)
Collecting grpcio
Using cached grpcio-1.39.0-cp38-cp38-manylinux2014_x86_64.whl (4.3 MB)
Collecting clu
Using cached clu-0.0.6-py3-none-any.whl (77 kB)
Collecting gym
Using cached gym-0.19.0.tar.gz (1.6 MB)
Collecting absl-py
Using cached absl_py-0.13.0-py3-none-any.whl (132 kB)
Collecting tfp-nightly[jax]<=0.13.0.dev20210422
Using cached tfp_nightly-0.13.0.dev20210422-py2.py3-none-any.whl (5.3 MB)
Collecting jaxlib
Using cached jaxlib-0.1.70-cp38-none-manylinux2010_x86_64.whl (46.9 MB)
Collecting dataclasses
Using cached dataclasses-0.6-py3-none-any.whl (14 kB)
Collecting opt-einsum
Using cached opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting chex>=0.0.4
Using cached chex-0.0.8-py3-none-any.whl (57 kB)
Requirement already satisfied: pillow>=6.2.0 in /usr/lib/python3/dist-packages (from matplotlib->flax->deluca) (7.0.0)
Collecting cycler>=0.10
Using cached cycler-0.10.0-py2.py3-none-any.whl (6.5 kB)
Collecting pyparsing>=2.2.1
Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting kiwisolver>=1.0.1
Using cached kiwisolver-1.3.1-cp38-cp38-manylinux1_x86_64.whl (1.2 MB)
Requirement already satisfied: python-dateutil>=2.7 in /usr/lib/python3/dist-packages (from matplotlib->flax->deluca) (2.7.3)
Requirement already satisfied: six>=1.5.2 in /usr/lib/python3/dist-packages (from grpcio->brax->deluca) (1.14.0)
Collecting tensorflow-datasets
Using cached tensorflow_datasets-4.4.0-py3-none-any.whl (4.0 MB)
Collecting packaging
Using cached packaging-21.0-py3-none-any.whl (40 kB)
Collecting ml-collections
Using cached ml_collections-0.1.0-py3-none-any.whl (88 kB)
Collecting tensorflow
Downloading tensorflow-2.6.0-cp38-cp38-manylinux2010_x86_64.whl (458.4 MB)
|▋ | 8.4 MB 16 kB/s eta 7:44:54
ERROR: Exception:
Traceback (most recent call last):
  File "/usr/share/python-wheels/urllib3-1.25.8-py2.py3-none-any.whl/urllib3/response.py", line 425, in _error_catcher
    yield
  File "/usr/share/python-wheels/urllib3-1.25.8-py2.py3-none-any.whl/urllib3/response.py", line 507, in read
    data = self._fp.read(amt) if not fp_closed else b""
  File "/usr/share/python-wheels/CacheControl-0.12.6-py2.py3-none-any.whl/cachecontrol/filewrapper.py", line 62, in read
    data = self.__fp.read(amt)
  File "/usr/lib/python3.8/http/client.py", line 455, in read
    n = self.readinto(b)
  File "/usr/lib/python3.8/http/client.py", line 499, in readinto
    n = self.fp.readinto(b)
  File "/usr/lib/python3.8/socket.py", line 669, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.8/ssl.py", line 1241, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.8/ssl.py", line 1099, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/pip/_internal/cli/base_command.py", line 186, in _main
    status = self.run(options, args)
  File "/usr/lib/python3/dist-packages/pip/_internal/commands/install.py", line 357, in run
    resolver.resolve(requirement_set)
  File "/usr/lib/python3/dist-packages/pip/_internal/legacy_resolve.py", line 177, in resolve
    discovered_reqs.extend(self._resolve_one(requirement_set, req))
  File "/usr/lib/python3/dist-packages/pip/_internal/legacy_resolve.py", line 333, in _resolve_one
    abstract_dist = self._get_abstract_dist_for(req_to_install)
  File "/usr/lib/python3/dist-packages/pip/_internal/legacy_resolve.py", line 282, in _get_abstract_dist_for
    abstract_dist = self.preparer.prepare_linked_requirement(req)
  File "/usr/lib/python3/dist-packages/pip/_internal/operations/prepare.py", line 480, in prepare_linked_requirement
    local_path = unpack_url(
  File "/usr/lib/python3/dist-packages/pip/_internal/operations/prepare.py", line 282, in unpack_url
    return unpack_http_url(
  File "/usr/lib/python3/dist-packages/pip/_internal/operations/prepare.py", line 158, in unpack_http_url
    from_path, content_type = _download_http_url(
  File "/usr/lib/python3/dist-packages/pip/_internal/operations/prepare.py", line 303, in _download_http_url
    for chunk in download.chunks:
  File "/usr/lib/python3/dist-packages/pip/_internal/utils/ui.py", line 160, in iter
    for x in it:
  File "/usr/lib/python3/dist-packages/pip/_internal/network/utils.py", line 15, in response_chunks
    for chunk in response.raw.stream(
  File "/usr/share/python-wheels/urllib3-1.25.8-py2.py3-none-any.whl/urllib3/response.py", line 564, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "/usr/share/python-wheels/urllib3-1.25.8-py2.py3-none-any.whl/urllib3/response.py", line 529, in read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
  File "/usr/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/share/python-wheels/urllib3-1.25.8-py2.py3-none-any.whl/urllib3/response.py", line 430, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
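Not part of the original report, but the traceback above ends in a read timeout while pip is downloading the 458 MB tensorflow wheel, so the failure looks like a slow network/download rather than anything specific to deluca. A common generic workaround is to raise pip's socket timeout and retry count with its standard options, for example:
pip install deluca --timeout 120 --retries 10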
Internal change
FUTURE_COPYBARA_INTEGRATE_REVIEW=https://github.com/google/deluca/pull/57 from google:inverse_map baa4932444495538d91151653165cdcb386b52fc
Hello! I made some modifications to AdaGPC (in _adaptive.py). In the existing implementation, GPC outperforms AdaGPC in the known LDS setting, which is the opposite of what one should expect. Based on some preliminary experiments, I believe AdaGPC is now working properly (at least in the known-dynamics version). I also made some miscellaneous changes in other files, e.g., to the imports in some of the agent files: I think there may have been some file restructuring across different versions of deluca, but the imports were not updated to reflect this change, causing some errors at runtime. Please let me know if you have any questions or concerns. Thanks!
[JAX] Avoid private implementation detail _ScalarMeta.
The closest public approximation to type(jnp.float32) is type[Any]. Nothing is ever actually an instance of one of these types, either (they build DeviceArrays if instantiated).
Hi,
Thanks for providing this interesting package.
I am trying to test DRC on a simple setup, and I notice that the current implementation of DRC does not work. For a simple partially observable linear system with A = np.array([[1.0, 0.95], [0.0, -0.9]]), B = np.array([[0.0], [1.0]]), C = np.array([[1.0, 0.0]]), Q = R = I, Gaussian process noise, and zero observation noise (the system is open-loop stable), the controller acts like a zero controller. I tried to get a different response by adjusting the hyperparameters, but the results are mostly the same. I then looked at the implementation on the deluca GitHub and noticed that the counterfactual cost does not seem to be defined correctly (if I am not wrong). According to Algorithm 1 in [1], we need to use M_t to compute y_t (which depends on the previous controls u, themselves computed using M_t), but in the implementation the previous controls based on M_{t-i} are used. I implemented the algorithm using M_t, but what I get after the simulation is either a control close to zero or an unstable one.
I was wondering if you have any code example for the DRC algorithm that works?
[1] Simchowitz, Max, Singh, Karan, and Hazan, Elad, "Improper Learning for Non-Stochastic Control", COLT 2020.
Thanks a lot. Sincerely, Farnaz
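For reference, here is a minimal NumPy-only sketch (not from the issue and not using deluca's API) of the partially observable system described above, with a placeholder zero controller standing in for DRC; the horizon, random seed, and the choice of weighting the scalar observation with Q are illustrative assumptions:

# Minimal sketch of the LDS from the issue above; plain NumPy, no deluca.
import numpy as np

A = np.array([[1.0, 0.95], [0.0, -0.9]])  # dynamics given in the issue
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q = np.eye(1)  # assumption: Q weights the scalar observation y (issue only says Q, R = I)
R = np.eye(1)  # weight on the scalar control u

rng = np.random.default_rng(0)
x = np.zeros((2, 1))
total_cost = 0.0
for t in range(100):
    y = C @ x                      # zero observation noise
    u = np.zeros((1, 1))           # placeholder "zero controller" baseline
    total_cost += float(y.T @ Q @ y + u.T @ R @ u)
    w = rng.normal(size=(2, 1))    # Gaussian process noise
    x = A @ x + B @ u + w          # state update
print("cumulative cost of the zero controller:", total_cost)

A working DRC implementation would be expected to at least match this zero-controller baseline on the same rollout, which makes the sketch a simple sanity check for the behavior described in the issue.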
Please see https://readthedocs.org/projects/deluca for details about this release.