.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/dicodile/plot_gait.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_dicodile_plot_gait.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_dicodile_plot_gait.py:

====================
Gait (steps) example
====================

In this example, we use DiCoDiLe on an open dataset of gait (steps)
IMU time series to discover patterns in the data. We will then use those
patterns to attempt to detect steps and compare our findings with the
ground truth.

.. GENERATED FROM PYTHON SOURCE LINES 10-14

.. code-block:: Python

    import matplotlib.pyplot as plt
    import numpy as np

.. GENERATED FROM PYTHON SOURCE LINES 15-16

Retrieve trial data
-------------------

.. GENERATED FROM PYTHON SOURCE LINES 16-21

.. code-block:: Python

    from dicodile.data.gait import get_gait_data

    trial = get_gait_data(subject=6, trial=1)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Downloading data from http://dev.ipol.im/~truong/GaitData.zip (192.3 MB)

.. GENERATED FROM PYTHON SOURCE LINES 35-37

Let's look at a small portion of the series for both feet, overlaid on the
same plot.

.. GENERATED FROM PYTHON SOURCE LINES 37-47

.. code-block:: Python

    fig, ax = plt.subplots()
    ax.plot(trial['data']['LAV'][5000:5800],
            label='left foot vertical acceleration')
    ax.plot(trial['data']['RAV'][5000:5800],
            label='right foot vertical acceleration')
    ax.set_xlabel('time (x10ms)')
    ax.set_ylabel('acceleration ($m.s^{-2}$)')
    ax.legend()

.. image-sg:: /auto_examples/dicodile/images/sphx_glr_plot_gait_002.png
   :alt: plot gait
   :srcset: /auto_examples/dicodile/images/sphx_glr_plot_gait_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 48-52

We can see the alternating left and right foot movements.
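The x-axis of this plot counts samples; the axis label hints that one sample
corresponds to 10 ms, i.e. a 100 Hz sampling rate (an assumption read off the
label, not stated explicitly by the dataset). A minimal sketch, under that
assumption, for converting a plotted slice into a time axis in seconds:

.. code-block:: Python

    import numpy as np

    fs = 100  # assumed sampling rate in Hz (one sample every 10 ms, per the axis label)
    start, stop = 5000, 5800  # the slice plotted above

    # time stamp in seconds of each sample in the slice
    t = np.arange(start, stop) / fs
    print(t[0], t[-1])  # the 800-sample window spans seconds 50.0 to 57.99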
In the rest of this example, we will only use the right foot vertical
acceleration.

.. GENERATED FROM PYTHON SOURCE LINES 54-55

Convolutional Dictionary Learning
---------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 57-61

Now, let's use "dicodile" as ``solver_z`` to learn patterns from the data and
reconstruct the signal from a sparse representation.

First, we initialize a dictionary from parts of the signal:

.. GENERATED FROM PYTHON SOURCE LINES 61-69

.. code-block:: Python

    X = trial['data']['RAV'].to_numpy()

    # reshape X to (n_trials, n_channels, n_times)
    X = X.reshape(1, 1, *X.shape)

    print(X.shape)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (1, 1, 18639)

.. GENERATED FROM PYTHON SOURCE LINES 70-73

Note the use of ``reshape`` to shape the signal as per alphacsc requirements:
the shape of the signal should be ``(n_trials, n_channels, n_times)``. Here,
we have a single-channel time series, so it is ``(1, 1, n_times)``.

.. GENERATED FROM PYTHON SOURCE LINES 73-121

.. code-block:: Python

    from alphacsc.init_dict import init_dictionary

    # set dictionary size
    n_atoms = 8

    # set individual atom (patch) size
    n_times_atom = 200

    D_init = init_dictionary(X,
                             n_atoms=n_atoms,
                             n_times_atom=n_times_atom,
                             rank1=False,
                             window=True,
                             D_init='chunk',
                             random_state=60)

    print(D_init.shape)

    from alphacsc import BatchCDL

    cdl = BatchCDL(
        # Shape of the dictionary
        n_atoms,
        n_times_atom,
        rank1=False,
        uv_constraint='auto',
        # Number of iterations for the alternate minimization and cvg threshold
        n_iter=3,
        # number of workers to be used for dicodile
        n_jobs=4,
        # solver for the z-step
        solver_z='dicodile',
        solver_z_kwargs={'max_iter': 10000},
        window=True,
        D_init=D_init,
        random_state=60)
    res = cdl.fit(X)

    from dicodile.utils.viz import display_dictionaries

    D_hat = res._D_hat

    fig = display_dictionaries(D_init, D_hat)

.. image-sg:: /auto_examples/dicodile/images/sphx_glr_plot_gait_003.png
   :alt: plot gait
   :srcset: /auto_examples/dicodile/images/sphx_glr_plot_gait_003.png
   :class: sphx-glr-single-img
.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (8, 1, 200)
    Started 4 workers in 1.64s
    [BatchCDL] CD iterations 0 / 3
    [BatchCDL] lambda = 2.658e+00
    [BatchCDL] Objective (z) : 4.270e+03 (sparsity: 3.010e-03)
    [BatchCDL] Resampled atom 0
    [BatchCDL] Objective (d) : 3.674e+03
    [BatchCDL] CD iterations 1 / 3
    [BatchCDL] Objective (z) : 3.401e+03 (sparsity: 3.301e-03)
    [BatchCDL] Resampled atom 1
    [BatchCDL] Objective (d) : 3.323e+03
    [BatchCDL] CD iterations 2 / 3
    [BatchCDL] Objective (z) : 3.219e+03 (sparsity: 3.301e-03)
    [BatchCDL] Resampled atom 5
    [BatchCDL] Objective (d) : 3.153e+03
    [BatchCDL] Fit in 12.0s

.. GENERATED FROM PYTHON SOURCE LINES 122-123

Signal reconstruction
---------------------

.. GENERATED FROM PYTHON SOURCE LINES 125-126

Now, let's reconstruct the original signal.

.. GENERATED FROM PYTHON SOURCE LINES 126-133

.. code-block:: Python

    from alphacsc.utils.convolution import construct_X_multi

    z_hat = res._z_hat

    X_hat = construct_X_multi(z_hat, D_hat)

.. GENERATED FROM PYTHON SOURCE LINES 134-135

Plot a small part of the original and reconstructed signals:

.. GENERATED FROM PYTHON SOURCE LINES 135-145

.. code-block:: Python

    fig_hat, ax_hat = plt.subplots()
    ax_hat.plot(X[0][0][5000:5800],
                label='right foot vertical acceleration (ORIGINAL)')
    ax_hat.plot(X_hat[0][0][5000:5800],
                label='right foot vertical acceleration (RECONSTRUCTED)')
    ax_hat.set_xlabel('time (x10ms)')
    ax_hat.set_ylabel('acceleration ($m.s^{-2}$)')
    ax_hat.legend()

.. image-sg:: /auto_examples/dicodile/images/sphx_glr_plot_gait_004.png
   :alt: plot gait
   :srcset: /auto_examples/dicodile/images/sphx_glr_plot_gait_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 146-147

Check that our representation is indeed sparse:

.. GENERATED FROM PYTHON SOURCE LINES 147-151

.. code-block:: Python

    np.count_nonzero(z_hat)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    487
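For scale: ``z_hat`` typically has shape ``(n_trials, n_atoms, n_times_valid)``
with ``n_times_valid = n_times - n_times_atom + 1``, so 487 nonzero
coefficients amounts to a density of roughly 0.33%, matching the sparsity the
solver reported (~3.3e-03). A quick sketch of that computation, using a
stand-in array in place of the real ``z_hat`` (positions and values below are
hypothetical):

.. code-block:: Python

    import numpy as np

    # Stand-in for z_hat: (n_trials, n_atoms, n_times_valid), with
    # n_times_valid = 18639 - 200 + 1 (valid positions for a 200-sample atom)
    rng = np.random.default_rng(0)
    z = np.zeros((1, 8, 18440))

    # scatter 487 hypothetical nonzero activations at distinct time positions
    times = rng.choice(z.shape[-1], size=487, replace=False)
    z[0, rng.integers(0, 8, size=487), times] = 1.0

    density = np.count_nonzero(z) / z.size
    print(f"{density:.2%}")  # about 0.33% of coefficients are nonzero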
.. GENERATED FROM PYTHON SOURCE LINES 152-154

Besides our visual check, a measure of how closely we are reconstructing the
original signal is the (normalized) cross-correlation. Let's compute this:

.. GENERATED FROM PYTHON SOURCE LINES 154-159

.. code-block:: Python

    np.correlate(X[0][0], X_hat[0][0]) / np.sqrt(
        np.correlate(X[0][0], X[0][0]) * np.correlate(X_hat[0][0], X_hat[0][0])
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    array([0.98280031])

.. GENERATED FROM PYTHON SOURCE LINES 160-161

Multichannel signals
--------------------

.. GENERATED FROM PYTHON SOURCE LINES 163-166

DiCoDiLe works just as well with multichannel signals. The gait dataset
contains 16 signals (8 for each foot); in the rest of this tutorial, we will
use three of those.

.. GENERATED FROM PYTHON SOURCE LINES 166-170

.. code-block:: Python

    # Left foot vertical acceleration, Y rotation and X acceleration
    channels = ['LAV', 'LRY', 'LAX']

.. GENERATED FROM PYTHON SOURCE LINES 171-172

Let's look at a small portion of multichannel data:

.. GENERATED FROM PYTHON SOURCE LINES 172-181

.. code-block:: Python

    colors = plt.rcParams["axes.prop_cycle"]()
    mc_fig, mc_ax = plt.subplots(len(channels), sharex=True)

    for ax, chan in zip(mc_ax, channels):
        ax.plot(trial['data'][chan][5000:5800],
                label=chan, color=next(colors)["color"])
    mc_fig.legend(loc="upper center")

.. image-sg:: /auto_examples/dicodile/images/sphx_glr_plot_gait_005.png
   :alt: plot gait
   :srcset: /auto_examples/dicodile/images/sphx_glr_plot_gait_005.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 182-183

Let's put the data in the shape alphacsc expects:
``(n_trials, n_channels, n_times)``.

.. GENERATED FROM PYTHON SOURCE LINES 183-190

.. code-block:: Python

    X_mc_subset = trial['data'][channels].to_numpy().T
    X_mc_subset = X_mc_subset.reshape(1, *X_mc_subset.shape)

    print(X_mc_subset.shape)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (1, 3, 18639)
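The transpose is needed because ``to_numpy()`` on the selected columns yields
``(n_times, n_channels)``, while alphacsc expects channels before times. A toy
sketch of the same reshaping (hypothetical data standing in for the gait
arrays):

.. code-block:: Python

    import numpy as np

    # toy stand-in for trial['data'][channels].to_numpy(): 6 samples x 3 channels
    data = np.arange(18).reshape(6, 3)

    X = data.T.reshape(1, 3, 6)  # -> (n_trials, n_channels, n_times)
    # equivalently: data.T[np.newaxis]

    print(X.shape)  # (1, 3, 6)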
.. GENERATED FROM PYTHON SOURCE LINES 191-193

Initialize the dictionary (note that the call is identical to the
single-channel version):

.. GENERATED FROM PYTHON SOURCE LINES 193-201

.. code-block:: Python

    D_init_mc = init_dictionary(X_mc_subset,
                                n_atoms=8,
                                n_times_atom=200,
                                rank1=False,
                                window=True,
                                D_init='chunk',
                                random_state=60)

    print(D_init_mc.shape)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    (8, 3, 200)

.. GENERATED FROM PYTHON SOURCE LINES 202-204

And run DiCoDiLe (note that the call is identical to the single-channel
version here as well):

.. GENERATED FROM PYTHON SOURCE LINES 204-226

.. code-block:: Python

    from alphacsc import BatchCDL

    cdl = BatchCDL(
        # Shape of the dictionary
        n_atoms,
        n_times_atom,
        rank1=False,
        uv_constraint='auto',
        # Number of iterations for the alternate minimization and cvg threshold
        n_iter=3,
        # number of workers to be used for dicodile
        n_jobs=4,
        # solver for the z-step
        solver_z='dicodile',
        solver_z_kwargs={'max_iter': 10000},
        window=True,
        D_init=D_init_mc,
        random_state=60)
    res = cdl.fit(X_mc_subset)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Started 4 workers in 1.62s
    [BatchCDL] CD iterations 0 / 3
    [BatchCDL] lambda = 3.191e+00
    [BatchCDL] Objective (z) : 7.983e+03 (sparsity: 5.437e-03)
    [BatchCDL] Objective (d) : 7.598e+03
    [BatchCDL] CD iterations 1 / 3
    [BatchCDL] Objective (z) : 7.473e+03 (sparsity: 4.305e-03)
    [BatchCDL] Objective (d) : 7.436e+03
    [BatchCDL] CD iterations 2 / 3
    [BatchCDL] Objective (z) : 7.404e+03 (sparsity: 3.918e-03)
    [BatchCDL] Objective (d) : 7.376e+03
    [BatchCDL] Fit in 14.4s

.. GENERATED FROM PYTHON SOURCE LINES 227-228

Signal reconstruction (multichannel)
------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 230-231

Now, let's reconstruct the original signal.

.. GENERATED FROM PYTHON SOURCE LINES 231-240

.. code-block:: Python

    from alphacsc.utils.convolution import construct_X_multi

    z_hat_mc = res._z_hat

    D_hat_mc = res._D_hat

    X_hat_mc = construct_X_multi(z_hat_mc, D_hat_mc)
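``construct_X_multi`` rebuilds the signal from the sparse code: each
reconstructed channel is, in essence, the sum over atoms of the activations
convolved with that atom's channel. A minimal numpy sketch of that operation
(a simplified stand-in for alphacsc's implementation, not its actual code):

.. code-block:: Python

    import numpy as np

    def reconstruct(z, D):
        """X_hat[n, c] = sum over atoms k of convolve(z[n, k], D[k, c])."""
        n_trials, n_atoms, n_times_valid = z.shape
        _, n_channels, n_times_atom = D.shape
        X_hat = np.zeros((n_trials, n_channels, n_times_valid + n_times_atom - 1))
        for n in range(n_trials):
            for k in range(n_atoms):
                for c in range(n_channels):
                    X_hat[n, c] += np.convolve(z[n, k], D[k, c])
        return X_hat

    # a single unit activation at t=3 copies the atom into the output at t=3
    z = np.zeros((1, 1, 10))
    z[0, 0, 3] = 1.0
    D = np.array([[[1.0, -1.0, 0.5]]])  # one atom, one channel, three samples
    print(reconstruct(z, D)[0, 0, 3:6])  # [ 1.  -1.   0.5]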
.. GENERATED FROM PYTHON SOURCE LINES 241-243

Let's visually compare a small part of the original and reconstructed signal,
along with the activations.

.. GENERATED FROM PYTHON SOURCE LINES 243-271

.. code-block:: Python

    print(z_hat_mc.shape)

    viz_start_idx = 4000
    viz_end_idx = 5800
    viz_chan = 2

    max_abs = np.max(np.abs(z_hat_mc), axis=-1)
    max_abs = max_abs.reshape(z_hat_mc.shape[1], 1)
    z_hat_normalized = z_hat_mc / max_abs

    fig_hat_mc, ax_hat_mc = plt.subplots(2, figsize=(12, 8))

    # plot original and reconstructed signals
    ax_hat_mc[0].plot(X_mc_subset[0][viz_chan][viz_start_idx:viz_end_idx],
                      label='ORIGINAL')
    ax_hat_mc[0].plot(X_hat_mc[0][viz_chan][viz_start_idx:viz_end_idx],
                      label='RECONSTRUCTED')
    ax_hat_mc[0].set_xlabel('time (x10ms)')
    ax_hat_mc[0].legend()

    # plot activations
    for idx in range(z_hat_normalized.shape[1]):
        ax_hat_mc[1].stem(z_hat_normalized[0][idx][viz_start_idx:viz_end_idx],
                          linefmt=f"C{idx}-",
                          markerfmt=f"C{idx}o")

.. image-sg:: /auto_examples/dicodile/images/sphx_glr_plot_gait_006.png
   :alt: plot gait
   :srcset: /auto_examples/dicodile/images/sphx_glr_plot_gait_006.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 42.663 seconds)

.. _sphx_glr_download_auto_examples_dicodile_plot_gait.py:

.. only:: html

    .. container:: sphx-glr-footer sphx-glr-footer-example

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: plot_gait.ipynb <plot_gait.ipynb>`

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: plot_gait.py <plot_gait.py>`

        .. container:: sphx-glr-download sphx-glr-download-zip

            :download:`Download zipped: plot_gait.zip <plot_gait.zip>`

.. only:: html

    .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_