
Getting Assertion Error #1

Open
Arpitrf opened this issue Nov 4, 2018 · 1 comment
Comments

@Arpitrf

Arpitrf commented Nov 4, 2018

Traceback (most recent call last):
  File "training.py", line 203, in <module>
    validation_data=(process_features(data.validation, data.metadata), data.validation_labels))
  File "/Users/apple/Desktop/Kaggle/Taxi Trajectory/vir/lib/python3.6/site-packages/keras/engine/training.py", line 950, in fit
    batch_size=batch_size)
  File "/Users/apple/Desktop/Kaggle/Taxi Trajectory/vir/lib/python3.6/site-packages/keras/engine/training.py", line 671, in _standardize_user_data
    self._set_inputs(x)
  File "/Users/apple/Desktop/Kaggle/Taxi Trajectory/vir/lib/python3.6/site-packages/keras/engine/training.py", line 575, in _set_inputs
    assert len(inputs) == 1
AssertionError

I am getting the above error. Any help is appreciated.
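
For context on why the assertion fires: in Keras 2.2.x, the check at keras/engine/training.py line 575 sits in the branch that `_set_inputs` takes for Sequential models built without an explicit input shape. When `fit()` receives a list of input arrays, Keras asserts `len(inputs) == 1` before it can infer the input shape, so passing the multi-feature list produced by `process_features` trips it. A minimal sketch of the same failure mode (hypothetical layer and shapes, not the project's actual model):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Deferred-build Sequential model: no input_shape is given, so Keras has to
# infer the inputs from the data passed to fit() via Model._set_inputs().
model = Sequential([Dense(1)])
model.compile(loss='mse', optimizer='sgd')

x1 = np.random.rand(10, 3)
x2 = np.random.rand(10, 3)
y = np.random.rand(10, 1)

# A single array works; a list of arrays reaches _set_inputs with
# len(inputs) == 2 and raises the AssertionError shown above.
model.fit([x1, x2], y, epochs=1)

The solution below sidesteps this by building a functional Model with one Input layer per feature, which accepts a list of arrays directly.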

@karafede

Here is my solution, which worked pretty well, using the Keras functional API:

# Imports assumed by this snippet (Keras 2.x with the TensorFlow backend);
# tf_haversine is the project's custom haversine-distance loss, defined elsewhere.
import tensorflow as tf
from keras import backend as K
from keras.layers import Input, Embedding, Flatten, Dense, Concatenate, Activation
from keras.models import Model
from keras.optimizers import SGD


def create_model(metadata, clusters):
    """
    Creates all the layers for our neural network model.
    """
    # metadata = data.metadata
    # Arbitrary dimension for all embeddings
    embedding_dim = 10

    # Quarter hour of the day embedding
    i_embed_quarter_hour = Input(shape=(1,))
    embed_quarter_hour = Embedding(metadata['n_quarter_hours'], embedding_dim, input_length=1)(i_embed_quarter_hour)
    quarter_hour_vec = Flatten()(embed_quarter_hour)

    # Day of the week embedding
    i_embed_day_of_week = Input(shape=(1,))
    embed_day_of_week = Embedding(metadata['n_days_per_week'], embedding_dim, input_length=1)(i_embed_day_of_week)
    day_of_week_vec = Flatten()(embed_day_of_week)

    # Week of the year embedding
    i_embed_week_of_year = Input(shape=(1,))
    embed_week_of_year = Embedding(metadata['n_weeks_per_year'], embedding_dim, input_length=1)(i_embed_week_of_year)
    week_of_year_vec = Flatten()(embed_week_of_year)

    # Client ID embedding
    i_embed_client_ids = Input(shape=(1,))
    embed_client_ids = Embedding(metadata['n_client_ids'], embedding_dim, input_length=1)(i_embed_client_ids)
    client_ids_vec = Flatten()(embed_client_ids)

    # Taxi ID embedding
    i_embed_taxi_ids = Input(shape=(1,))
    embed_taxi_ids = Embedding(metadata['n_taxi_ids'], embedding_dim, input_length=1)(i_embed_taxi_ids)
    taxi_ids_vec = Flatten()(embed_taxi_ids)

    # Taxi stand ID embedding
    i_embed_stand_ids = Input(shape=(1,))
    embed_stand_ids = Embedding(metadata['n_stand_ids'], embedding_dim, input_length=1)(i_embed_stand_ids)
    stand_ids_vec = Flatten()(embed_stand_ids)

    # GPS coordinates (first 5 and last 5 lat/long points, i.e. 20 values)
    i_coords = Input(shape=(20,))
    coords = Dense(1)(i_coords)

    # Merge all the branches into a single layer
    merged_layer = Concatenate()([
        quarter_hour_vec,
        day_of_week_vec,
        week_of_year_vec,
        client_ids_vec,
        taxi_ids_vec,
        stand_ids_vec,
        coords])

    # Final activation layer: the destination is the weighted mean of the
    # cluster coordinates, weighted by the softmax probabilities
    cast_clusters = K.cast_to_floatx(clusters)

    def destination(probabilities):
        return tf.matmul(probabilities, cast_clusters)

    i_output_layer = Dense(500, activation='relu')(merged_layer)
    output_layer = Dense(len(clusters), activation='softmax')(i_output_layer)
    output_layer_final = Activation(destination)(output_layer)

    # Wire all seven inputs to the final output
    model = Model(inputs=[i_embed_quarter_hour,
                          i_embed_day_of_week,
                          i_embed_week_of_year,
                          i_embed_client_ids,
                          i_embed_taxi_ids,
                          i_embed_stand_ids,
                          i_coords], outputs=output_layer_final)

    # Compile the model
    optimizer = SGD(lr=0.01, momentum=0.9, clipvalue=1.)  # Use `clipvalue` to prevent exploding gradients
    model.compile(loss=tf_haversine, optimizer=optimizer)

    return model
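
For completeness, a minimal smoke test of the model above, assuming everything is pasted into one file. The metadata counts, cluster centres and the stand-in tf_haversine below are made up for illustration only (the real ones come from the project's preprocessing and loss code); the point is that fit() now takes a list with one array per Input layer, in the same order as the inputs list passed to Model:

import numpy as np
import tensorflow as tf
from keras import backend as K

# Stand-in for the project's tf_haversine loss (defined elsewhere in the repo):
# mean great-circle distance in km between (lat, long) pairs given in degrees.
def tf_haversine(y_true, y_pred):
    rad = np.pi / 180.0
    lat1, lon1 = y_true[:, 0] * rad, y_true[:, 1] * rad
    lat2, lon2 = y_pred[:, 0] * rad, y_pred[:, 1] * rad
    a = (K.sin((lat2 - lat1) / 2) ** 2 +
         K.cos(lat1) * K.cos(lat2) * K.sin((lon2 - lon1) / 2) ** 2)
    return K.mean(2 * 6371.0 * tf.asin(K.sqrt(a)))

# Synthetic data with hypothetical sizes, just to check that the shapes line up.
metadata = {'n_quarter_hours': 96, 'n_days_per_week': 7, 'n_weeks_per_year': 52,
            'n_client_ids': 1000, 'n_taxi_ids': 500, 'n_stand_ids': 64}
clusters = np.random.rand(500, 2)      # (n_clusters, 2) destination cluster centres

n = 32
inputs = [np.random.randint(0, metadata['n_quarter_hours'], (n, 1)),
          np.random.randint(0, metadata['n_days_per_week'], (n, 1)),
          np.random.randint(0, metadata['n_weeks_per_year'], (n, 1)),
          np.random.randint(0, metadata['n_client_ids'], (n, 1)),
          np.random.randint(0, metadata['n_taxi_ids'], (n, 1)),
          np.random.randint(0, metadata['n_stand_ids'], (n, 1)),
          np.random.rand(n, 20)]       # 20 GPS coordinate values per trip
targets = np.random.rand(n, 2)         # destination (lat, long)

model = create_model(metadata, clusters)
model.fit(inputs, targets, epochs=1, batch_size=16)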
