Google Colab
The code link has been updated. Now you can access it here.
I was trying to implement the smile-detection project from Kaggle on Google Colab.
Kaggle link:
smile-detection | Kaggle
I downloaded the dataset, uploaded it to my Drive, and mounted it. My problem is with this line:
H=model.fit(train_X,train_y,validation_data=(test_X,test_y),epochs=15,batch_size=32)
It shows this error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-7-094cecbab710> in <cell line: 1>()
----> 1 H=model.fit(train_X,train_y,validation_data=(test_X,test_y),epochs=15,batch_size=28)
1 frames
/usr/local/lib/python3.10/dist-packages/keras/src/engine/data_adapter.py in __init__(self, x, y, sample_weight, batch_size, steps_per_epoch, initial_epoch, epochs, shuffle, class_weight, max_queue_size, workers, use_multiprocessing, model, steps_per_execution, distribute, pss_evaluation_shards)
1317
1318 if self._inferred_steps == 0:
-> 1319 raise ValueError("Expected input data to be non-empty.")
1320
1321 def _configure_dataset_and_inferred_steps(
ValueError: Expected input data to be non-empty.
So I tried changing the batch size to 4 and 16, but I got the same error.
What I have tried:
I tried changing the batch size several times.
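The batch size is not the cause here: Keras raises "Expected input data to be non-empty" when the arrays passed to `model.fit` contain zero samples, which usually means the dataset path on the mounted Drive is wrong and the loader silently returned empty arrays. A minimal sanity check you could run just before `model.fit` (a sketch, reusing the `train_X`/`train_y`/`test_X`/`test_y` names from the question; the empty arrays at the bottom are a hypothetical reproduction of the failure mode, not the real data):

```python
import numpy as np

def check_non_empty(train_X, train_y, test_X, test_y):
    """Fail early with a clear message if any split has zero samples,
    which is what triggers Keras's 'Expected input data to be non-empty'."""
    for name, arr in [("train_X", train_X), ("train_y", train_y),
                      ("test_X", test_X), ("test_y", test_y)]:
        arr = np.asarray(arr)
        print(f"{name}: shape={arr.shape}")
        if arr.shape[0] == 0:
            raise ValueError(
                f"{name} is empty -- check the dataset path on the "
                "mounted Drive before calling model.fit"
            )

# Hypothetical: empty arrays reproduce the same failure mode as the traceback.
try:
    check_non_empty(np.empty((0, 64, 64, 1)), np.empty((0,)),
                    np.empty((0, 64, 64, 1)), np.empty((0,)))
except ValueError as e:
    print(e)
```

If the shapes printed here show zero samples, the fix is in the data-loading step (the Drive path or the image-reading loop), not in the `model.fit` call.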