You can use Boltzmann's H-theorem to compute how the entropy changes with time.

Suppose you have a collision term $C(f)$ for the probability distribution function $f$, i.e. $f$ satisfies a kinetic equation of the form $\frac{Df}{Dt} = C(f)$. Let $s = \ln f$ (this generates the entropy density), multiply the kinetic equation by $s$ and integrate over the phase space $\Sigma$. Then you have ($\langle\cdot\rangle$ denotes the average)

$$\frac{D\langle s\rangle}{Dt} = \int_{\Sigma} d\sigma \, s \, C(f). \qquad (*)$$
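To see where (*) comes from (a short check, using only the chain rule and particle conservation): since $\langle s\rangle = \int_{\Sigma} d\sigma\, f \ln f$, differentiating gives

$$\frac{D\langle s\rangle}{Dt} = \int_{\Sigma} d\sigma\,(1+\ln f)\,\frac{Df}{Dt} = \int_{\Sigma} d\sigma\,(1+\ln f)\,C(f) = \int_{\Sigma} d\sigma\, s\, C(f),$$

where the term with the $1$ drops out because collisions conserve particle number, $\int_{\Sigma} d\sigma\, C(f) = 0$.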

Suppose that there exists an equilibrium probability density $f_0$ with $C(f_0) = 0$ (no entropy production). The distribution function depends on all of the phase-space variables $x_i$. You can then expand the non-equilibrium probability density in terms of the equilibrium density by the series expansion
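For the standard Boltzmann collision term, for instance, the Maxwell–Boltzmann distribution is the usual example of such an $f_0$: detailed balance for every collision $(v, v_1) \to (v', v_1')$ makes the collision integral vanish identically,

$$f_0(v) \propto \exp\!\left(-\frac{m(v-u)^2}{2k_B T}\right), \qquad f_0(v')\,f_0(v_1') = f_0(v)\,f_0(v_1) \;\Rightarrow\; C(f_0) = 0,$$

which holds because energy and momentum are conserved in each collision.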

$$f = \left(1 + (\langle x_i\rangle - \langle x_i\rangle_0)\frac{\partial}{\partial x_i} + (\langle x_i x_j\rangle - \langle x_j\rangle\langle x_i\rangle_0 - \langle x_j\rangle_0\langle x_i\rangle - \langle x_i x_j\rangle_0)\frac{\partial^2}{\partial x_i \partial x_j} + \dots\right)f_0$$

(the summation convention over repeated indices is used).

Here $\langle\cdot\rangle_0$ denotes averaging with the equilibrium distribution function; these equilibrium averages are known. You can convince yourself that this expansion holds by taking various moments of it and using the fact that the integral of a total derivative vanishes.
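A quick numerical sketch of that moment argument (my own illustration, not from the original): for a 1D Gaussian equilibrium $f_0$, integration by parts gives $\int \partial_x f_0\, dx = 0$ and $\int x\,\partial_x f_0\, dx = -\int f_0\, dx = -1$, and it is exactly this mechanism that fixes the coefficients when you take moments of the expansion.

```python
import numpy as np

# 1D Gaussian equilibrium density f0 on a uniform grid (illustration only)
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
f0 = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

df0 = np.gradient(f0, x)       # numerical derivative d f0 / dx

# Integral of a total derivative vanishes; the first moment picks up -1
I0 = df0.sum() * dx            # ∫ ∂x f0 dx        -> 0
I1 = (x * df0).sum() * dx      # ∫ x ∂x f0 dx       -> -1 (= -∫ f0 dx)

print(I0, I1)
```

The same boundary-term bookkeeping in higher moments fixes the second-order coefficients of the expansion.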

Substituting this expansion into equation (*) gives you an expansion of the entropy production rate in the nonequilibrium moments $\langle x_i\rangle, \langle x_i x_j\rangle, \dots$. This yields a connection between the entropy change and quantities that are easy to measure.
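As a concrete numerical illustration of the sign of (*), one can use a BGK-type relaxation ansatz $C(f) = -(f - f_0)/\tau$ (my assumption here, the simplest collision model, not the full Boltzmann operator), with $f_0$ a Gaussian matching the mean and variance of $f$. Then $\int s\,C(f)\,d\sigma \le 0$, i.e. $\langle s\rangle = \langle\ln f\rangle$ decreases and the entropy $-\langle s\rangle$ grows:

```python
import numpy as np

# Non-equilibrium density f ~ exp(-x^4), normalized on a grid (illustration)
x = np.linspace(-3.0, 3.0, 6001)
dx = x[1] - x[0]
f = np.exp(-x**4)
f /= f.sum() * dx

# BGK-type equilibrium: Gaussian with the same mean and variance as f
mu = (x * f).sum() * dx
var = ((x - mu)**2 * f).sum() * dx
f0 = np.exp(-(x - mu)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

tau = 1.0
C = -(f - f0) / tau            # model collision term C(f)

# Right-hand side of (*): ∫ s C(f) dx with s = ln f; should come out <= 0
rate = (np.log(f) * C).sum() * dx
print(rate)
```

Matching the mean and variance of $f_0$ to those of $f$ is what makes the quadratic $\ln f_0$ terms drop out, so the integral reduces to $-\frac{1}{\tau}\int (f - f_0)\ln(f/f_0)\,dx \le 0$ by the Gibbs inequality.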