Added gold standard data. Seq2seq with attention: training works, but building the inference models raises an error:
Traceback (most recent call last):
  File "seq2seq_attention.py", line 163, in <module>
    outputs=[decoder_out] + decoder_states)
  File "/home/sevajuri/anaconda3/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/sevajuri/anaconda3/lib/python3.6/site-packages/keras/engine/topology.py", line 1811, in __init__
    str(layers_with_complete_input))
RuntimeError: Graph disconnected: cannot obtain value for tensor Tensor("input_1_1:0", shape=(?, 27), dtype=float32) at layer "input_1". The following previous layers were accessed without issue: ['input_2']
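This "Graph disconnected" error typically appears when the inference decoder model is built from tensors that are still wired to the training encoder's input, instead of from fresh `Input` placeholders. A minimal sketch of the usual fix follows; it omits the attention layer and uses illustrative names and sizes (`latent_dim`, `enc_vocab`, `dec_vocab`), so it is an assumption about the cause, not the project's actual code:

```python
# Minimal seq2seq inference-model sketch (no attention), showing how to
# reuse *layers* with fresh Input tensors so the decoder model is not
# wired back to the encoder's input. All names/sizes here are illustrative.
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

latent_dim, enc_vocab, dec_vocab = 64, 27, 30  # 27 echoes the traceback's shape (?, 27)

# --- training graph ---
encoder_inputs = Input(shape=(None, enc_vocab))
encoder_lstm = LSTM(latent_dim, return_state=True)
_, state_h, state_c = encoder_lstm(encoder_inputs)

decoder_inputs = Input(shape=(None, dec_vocab))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
dec_train_out, _, _ = decoder_lstm(decoder_inputs, initial_state=[state_h, state_c])
decoder_dense = Dense(dec_vocab, activation="softmax")
train_model = Model([encoder_inputs, decoder_inputs], decoder_dense(dec_train_out))

# --- inference graphs: reuse the layers, but feed *new* state inputs ---
encoder_model = Model(encoder_inputs, [state_h, state_c])

# Passing the training-time state tensors (state_h/state_c) here instead of
# these fresh Inputs drags encoder_inputs into the decoder model's graph and
# triggers exactly the "Graph disconnected" RuntimeError above.
state_h_in = Input(shape=(latent_dim,))
state_c_in = Input(shape=(latent_dim,))
dec_out, h, c = decoder_lstm(decoder_inputs, initial_state=[state_h_in, state_c_in])
decoder_model = Model([decoder_inputs, state_h_in, state_c_in],
                      [decoder_dense(dec_out), h, c])
```

At inference time `encoder_model` is run once to get the initial states, which are then fed step by step into `decoder_model` together with the previously predicted token.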
Showing 5 changed files:
- code_jurica/_layers.py: 57 additions, 88 deletions
- code_jurica/classificationICD10_attention.py: 2 additions, 1 deletion
- code_jurica/loader.py: 1 addition, 0 deletions
- code_jurica/multiTaskEmbeddings.py: 0 additions, 0 deletions
- code_jurica/seq2seq_attention.py: 239 additions, 0 deletions