Piecewise constant martingales and lazy clocks
Probability, Uncertainty and Quantitative Risk, volume 4, Article number: 2 (2019)
Abstract
Conditional expectations (like, e.g., discounted prices in financial applications) are martingales under an appropriate filtration and probability measure. When the information flow arrives in a punctual way, a reasonable assumption is to suppose the latter to have piecewise constant sample paths between the random times of information updates. Providing a way to find and construct piecewise constant martingales evolving in a connected subset of \(\mathbb {R}\) is the purpose of this paper. After a brief review of possible standard techniques, we propose a construction scheme based on the sampling of latent martingales \(\tilde {Z}\) with lazy clocks θ. These θ are time-change processes staying in arrears of the true time but that can synchronize at random times to the real (calendar) clock. This specific choice makes the resulting time-changed process \(Z_{t}=\tilde {Z}_{\theta _{t}}\) a martingale (called a lazy martingale) without any assumption on \(\tilde {Z}\), and in most cases, the lazy clock θ is adapted to the filtration of the lazy martingale Z, so that sample paths of Z on [0,T] only require sample paths of \(\left (\theta, \tilde {Z}\right)\) up to T. This would not be the case if the stochastic clock θ could be ahead of the real clock, as is typically the case using standard time-change processes. The proposed approach yields an easy way to construct analytically tractable lazy martingales evolving on (intervals of) \(\mathbb {R}\).
Introduction
Martingales play a central role in probability theory, but also in many applications. This is specifically true in mathematical finance, where they are used to model Radon–Nikodym derivative processes or discounted prices in arbitrage-free market models (Jeanblanc et al. 2007). More generally, it is very common to deal with conditional expectation processes \(Z = (Z_{t})_{t \in [0,T]},\; Z_{t} := \mathbb {E}[Z_{T}\mid \mathcal {F}_{t}]\), where \(\mathbb {F}:=(\mathcal {F}_{t})_{t \in [0,T]}\) is a reference filtration and \(\mathbb {E}\) stands for the expectation operator associated with a given probability measure \(\mathbb {P}\). Many different modeling setups have been proposed to represent the dynamics of Z (e.g., random walk, Brownian motion, geometric Brownian motion, jump diffusion, etc.) depending on some assumptions about its range, pathwise continuity, or continuous versus discrete-time setting. In many circumstances, however, information can be considered to arrive at random times, or in a partial (punctual) way.
An interesting application in that respect is the modeling of quoted recovery rates. The recovery rate r of a firm corresponds to the ratio of the debt that will be recovered after the firm’s default during an auction process. It is also a major factor driving the price of corporate bonds or other derivative instruments like credit default swaps or credit linked notes. In many standard models (like those suggested by the International Swaps and Derivatives Association (ISDA)), the recovery rate process is assumed constant (see, e.g., Markit (2004)). Many studies stressed the fact that r is in fact not a constant: it cannot be observed prior to the firm’s default τ; r is an \(\mathcal {F}_{\tau }\)-measurable random variable in [0,1]. This simple observation can have serious consequences in terms of pricing and risk management of credit sensitive products, and explains the development of stochastic recovery models (Amraoui et al. 2012; Andersen and Sidenius 2004). A further development in credit risk modeling is to take into account the fact that recovery rates can be “dynamized” (Gaspar and Slinko 2008). Quoted recovery rates, for instance, can thus be modeled as a stochastic process R=(R_{t})_{t≥0} that gives the “market’s view” of a firm’s recovery rate as seen from time t. Hence, \(R_{t} := \mathbb {E}[r \mid \mathcal {F}_{t}]\) can be seen as a martingale evolving in the unit interval. By correlating R with the creditworthiness of the firm, it becomes possible to account for a well-known fact in finance: recovery rate and default probability are statistically linked (Altman et al. 2003). However, observations for the process R are limited: updates in recovery rate quotes arrive in a scarce and random way.
Therefore, in contrast with the common setup, it is more realistic to represent R as a martingale whose trajectories remain constant for long periods of time, but “change” only occasionally, upon arrival of related information (e.g., when a dealer updates its view to specialized data providers). More generally, such types of martingales could be used to model discounted price processes of financial instruments observed under partial (punctual) information, e.g., at some random times, but also to represent price processes of illiquid products. Indeed, without additional information, a reasonable approach may consist of assuming that discounted prices remain constant between arrivals of market quotes, and jump to the level given by the new quote when a new trade is done.
Whereas discrete-time and continuous martingales have been extensively studied in the literature, very little work has been done with respect to martingales having piecewise constant sample paths. In this paper, we propose a methodology to find and construct such types of martingales. The special case of step martingales (which are martingales with piecewise constant sample paths, but restricted to a finite number of jumps in any finite interval) has been studied in Boel et al. (1975a, b), with emphasis on representation theorems and applications to communication and control problems. In Herdegen and Herrmann (2016), the authors investigate a single jump case, in which the first part of the path (before the unique jump) is supposed to be deterministic. We extend this research in several ways. First, we relax the (strong) step-martingale restriction and deal with the broader class of processes featuring possibly infinitely many jumps in a time interval. Second, our approach allows one to build martingales that evolve in a bounded interval, a problem that has received little attention so far and whose relevance is stressed by the above recovery example, but could also be of interest for modeling stochastic probabilities or correlations. This is achieved by introducing a new class of time-change processes called lazy clocks. Finally, we provide and study numerous examples and propose some construction algorithms.
The paper is organized as follows. In Section 2, we formally introduce the concept of piecewise constant martingales Z and present several routes to construct these processes. We then introduce a different approach in Section 3, where (Z_{t})_{t≥0} is modeled as a time-changed process \(\left (\tilde {Z}_{\theta _{t}}\right)_{t \geq 0}\), where θ is a lazy clock. The latter are time-change processes built in such a way that the stochastic clock always stays in arrears of the real clock (θ_{t}≤t a.s.). This condition is motivated by computational considerations: it guarantees that sampling Z over a fixed time horizon [0,T] only requires the sampling of \(\left (\tilde {Z},\theta \right)\) over the same period. Finally, as our objective is to provide a workable methodology, we derive the analytical expression for the distributions and moments in some particular cases, and provide efficient sampling algorithms for the simulations of such martingales.
Piecewise constant martingales
In the literature, pure jump processes defined on a filtered probability space \((\Omega,\mathcal {F}, \mathbb {F}, \mathbb {P})\), where \(\mathbb {F}=(\mathcal {F}_{t})_{t\in [0,T]}\) and \(\mathcal {F}:=\mathcal {F}_{T}\), are often referred to as stochastic processes having no diffusion part. In this paper we are interested in a subclass of pure jump processes: piecewise constant (PWC) martingales defined as follows.
Definition 1
(Piecewise constant martingale) A piecewise constant \(\mathbb {F}\)-martingale Z is a càdlàg \(\mathbb {F}\)-martingale whose jumps \(\Delta Z_{s}=Z_{s}-Z_{s^{-}}\) are summable (i.e., \(\sum _{s\leq T} |\Delta Z_{s}|<+\infty \) a.s.) and such that for every t≥0 :
$$Z_{t} = Z_{0}+\sum_{0<s\leq t} \Delta Z_{s} \quad \text{a.s.}$$
In particular, the sample paths of Z(ω) for ω∈Ω belong to the class of piecewise constant functions of time.
Note that an immediate consequence of this definition is that a PWC martingale has finite variation. Such types of processes may be used to represent martingales observed under partial (punctual) information, e.g., at some (random) times. One possible field of application is mathematical finance, where discounted price processes are martingales under an equivalent measure. Without additional information, a reasonable approach may consist of assuming that discounted prices remain constant between arrivals of market quotes, and jump to the level given by the new quote when a new trade is done. More generally, this could represent conditional expectation processes (i.e., “best guess”) where information arrives in a discontinuous way.
Most of the “usual” martingales with no diffusion term fail to have piecewise constant sample paths. For example, Azéma’s first martingale M=(M_{t})_{t≥0} defined as
$$M_{t} = \text{sgn}(W_{t})\,\sqrt{\frac{\pi}{2}\left(t-g_{t}^{0}(W)\right)}\,, \qquad g_{t}^{0}(W):=\sup\{s\leq t:\, W_{s}=0\}\,,$$
where W is a Brownian motion, is essentially piecewise square-root. Interestingly, one can show that \(M_{t}=\mathbb {E}\left [W_{t}\mid \mathcal {F}_{g_{t}^{0}(W)+} \right ]\), so that M actually corresponds to the projection of W onto its slow filtration; see, e.g., Dellacherie et al. (1992), Mansuy and Yor (2006), and Chapter IV, Section 8 of Protter (2005) for a detailed analysis of this process. Similarly, the geometric Poisson process \(\mathrm {e}^{N_{t}\log (1+\sigma)-\lambda \sigma t}\) is a positive martingale with piecewise negative-exponential sample paths (Shreve 2004, Ex. 11.5.2).
However, finding such types of processes is not difficult. We provide below three different methods to construct some of them. Yet, not all are equally powerful in terms of tractability. The last method proves to be quite appealing in that it yields analytically tractable PWC martingales whose range can be any connected set.
An autoregressive construction scheme
We start by looking at a subset of PWC martingales, namely step martingales. These are martingales whose paths belong to the space of step functions on any bounded interval, i.e., whose paths are a finite linear combination of indicator functions of intervals. As a consequence, a step martingale Z admits a finite number of jumps on [0,T], taking place at, say, (τ_{k}, k≥1), and may be decomposed as (with τ_{0}=0)
Looking at such decomposition, we see that step martingales may easily be constructed by an autoregressive scheme.
Proposition 1
Let Z be a càdlàg process with integrable variation starting from Z_{0}. We assume that \(\mathbb {E}[|Z_{0}|]<+\infty \). Then, the following are equivalent:

Z is a step martingale with respect to its natural filtration \(\mathbb {F}\),

there exists a strictly increasing sequence of random times (τ_{k}, k≥0) starting from τ_{0}=0, taking values in [0,+∞] and with no point of accumulation such that
$$ Z_{t} := Z_{0}+\sum\limits_{k=1}^{+\infty} \left(Z_{\tau_{k}}-Z_{\tau_{k-1}}\right)1_{\{\tau_{k} \leq t\}} $$(2) and which satisfies for any 0≤s≤t :
$$ \sum\limits_{k=1}^{+\infty} \mathbb{E}\left[\left. \left(Z_{\tau_{k}}-Z_{\tau_{k-1}}\right) 1_{\{s< \tau_{k}\leq t\}} \right|\mathcal{F}_{s}\right]=0\;. $$(3) Furthermore, the filtration \(\mathbb {F}\) is given for s≥0 by \(\mathcal {F}_{s} = \sigma \left (\left (Z_{\tau _{k}}, \tau _{k}\right) \text { for } k\geq 0 \text { such that } \tau _{k}\leq s\right)\).
Proof
i) →ii) Let Z be a step martingale with respect to its natural filtration \(\mathbb {F}\), and denote by (τ_{k},k≥0) the sequence of its successive jumps, with τ_{0}=0. If Z only admits a finite number of jumps n_{0}, then we set τ_{n}=+∞ for n>n_{0}. This choice of the sequence (τ_{k},k≥0) implies that the filtration \(\mathbb {F}\) equals
$$\mathcal{F}_{s} = \sigma\left(\left(Z_{\tau_{k}}, \tau_{k}\right) \text{ for } k\geq 0 \text{ such that } \tau_{k}\leq s\right)$$
and that we have the representation (2).
Taking the expectation with respect to \(\mathcal {F}_{s}\) with 0≤s≤t on both sides and applying Fubini’s theorem (since Z is of integrable variation), we deduce that :
which implies that the second sum on the right-hand side is null.
ii) →i) Define
and observe that since the sequence (τ_{k},k≥0) has no point of accumulation, Z is clearly a step process. Furthermore, since Z is of integrable variation, we have
which proves that Z_{t} is integrable for any t≥0. Finally, as in the first part of the proof, taking the expectation with respect to \(\mathcal {F}_{s}\) in (2) and using (3), we deduce that :
which proves that Z is indeed a martingale. □
Corollary 1
Let M be a martingale of integrable variation, and let (τ_{k},k≥0) be an increasing sequence of random times starting from τ_{0}=0, taking values in (0,+∞], with no point of accumulation and which is independent of M. Then, if Z_{0} is an integrable random variable, the process
$$Z_{t} := Z_{0}+\sum_{k=1}^{+\infty} \left(M_{\tau_{k}}-M_{\tau_{k-1}}\right)1_{\{\tau_{k} \leq t\}}$$
is a step martingale with respect to the filtration \(\mathbb {F}\) given, for s≥0, by \(\mathcal {F}_{s} = \sigma \left (\left (M_{\tau _{k}}, \tau _{k}\right) \text { for }k\geq 1 \text { such that } \tau _{k}\leq s\right)\).
Proof
We only need to check that (3) is satisfied, which is a consequence of the tower property of conditional expectations. Indeed, define the larger filtration \(\mathcal {G}_{s}=\sigma ((\tau _{i}, i\geq 1),\, (M_{u}, u\leq s))\) and observe that on the set {s<τ_{k}≤t} :
Then, since M is a \(\mathbb {G}\)martingale :
□
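Corollary 1 suggests a direct simulation recipe: sample an independent latent martingale at the arrival times of an independent Poisson process and keep the path constant in between. Below is a minimal Python sketch of this recipe (our own illustration, not the authors' code; the function name and parameter choices are ours), with a driftless Gaussian martingale playing the role of M:

```python
import random

def step_martingale_path(T, lam, sigma, seed=None):
    """Corollary 1 sketch: freeze an independent driftless Gaussian
    martingale M at the arrival times tau_1 < tau_2 < ... of a
    Poisson(lam) process; Z jumps to M_{tau_k} at each arrival."""
    rng = random.Random(seed)
    times, values = [0.0], [0.0]     # (tau_k, Z_{tau_k}), with Z_0 = 0
    t, m = 0.0, 0.0                  # calendar time and latent value M_t
    while True:
        dt = rng.expovariate(lam)    # exponential inter-arrival time
        if t + dt > T:
            break
        t += dt
        # Brownian increment of M over the inter-arrival gap:
        m += rng.gauss(0.0, sigma * dt ** 0.5)
        times.append(t)
        values.append(m)
    return times, values

times, values = step_martingale_path(T=10.0, lam=2.0, sigma=1.0, seed=42)
```

Averaging the terminal values over many independent paths gives a mean close to \(Z_0=0\), consistent with the martingale property.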
Remark 1
Observe that the natural filtration of Z in Proposition 1 satisfies the identity \(\mathcal {F}_{t} = \mathcal {F}_{\tau _{k}}\) on the set {τ_{k}≤t<τ_{k+1}}. Since the random times (τ_{k},k≥0) are stopping times in the filtration \(\mathbb {F}\), this implies that \(\mathbb {F}\) is a jumping filtration, following the definition of Jacod and Skorokhod (1994).
Example 1
Let N be a counting process and let \(\tau _{1},\ldots,\tau _{N_{t}}\) be the sequence of jump times of N on [0,t] with τ_{0}:=0. If (Y_{k},k≥1) is a family of independent and centered random variables, independent from N, then
$$Z_{t} := Z_{0} + \sum_{k=1}^{N_{t}} Y_{k}$$
is a PWC martingale. Note that we may choose the range of such a PWC martingale by taking bounded random variables. For instance, if Z_{0}=0 and for any k≥1,
with a<0<b, then for any t≥0, we have Z_{t}∈[a,b] a.s.
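Example 1 can be simulated directly. The sketch below is our own illustration: the two-point law for the Y_{k} and the normalizing constant 6/π² are our choices (consistent with supports shrinking like 1/k²), with Y_{k} centered on \([6a/(\pi^{2}k^{2}),\, 6b/(\pi^{2}k^{2})]\) so that the partial sums of the endpoints stay within [a,b]:

```python
import math
import random

def bounded_pwc_martingale(T, lam, a, b, seed=None):
    """Example 1 sketch with range control (our parametrisation): the k-th
    jump Y_k is a centered two-point variable on [a_k, b_k] with
    a_k = (6a/pi^2)/k^2 and b_k = (6b/pi^2)/k^2; since sum_k a_k = a and
    sum_k b_k = b, every path of Z stays in [a, b]."""
    rng = random.Random(seed)
    t, k, z = 0.0, 0, 0.0
    path = [(0.0, 0.0)]              # (jump time, Z value), Z_0 = 0
    while True:
        t += rng.expovariate(lam)    # Poisson(lam) jump times
        if t > T:
            break
        k += 1
        a_k = 6.0 * a / (math.pi ** 2 * k ** 2)   # a_k < 0 < b_k
        b_k = 6.0 * b / (math.pi ** 2 * k ** 2)
        # choosing P(Y_k = b_k) = -a_k / (b_k - a_k) makes Y_k centered
        y = b_k if rng.random() < -a_k / (b_k - a_k) else a_k
        z += y
        path.append((t, z))
    return path

path = bounded_pwc_martingale(T=50.0, lam=5.0, a=-1.0, b=2.0, seed=1)
```

Every simulated path stays in [−1,2] by construction, whatever the number of jumps.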
The above corollary provides us with a simple method to construct PWC martingales. Yet, it suffers from two restrictions. First, the distribution of Z_{t} requires averaging the conditional distribution with respect to the counting process, which may be an infinite sum. Second, controlling the range of the resulting martingale requires strong assumptions. One might try to relax the i.i.d. assumption of the Y_{k}’s: in Example 1, the Y_{k}’s are independent, but their support decreases as 1/k^{2}. One could also draw Y_{k} from a distribution whose support is state dependent, like \(\left [ a-Z_{\tau _{k-1}}, b-Z_{\tau _{k-1}} \right ]\); then Z_{t}∈[a,b] for all t∈[0,T]. In the sequel, we address these drawbacks by proposing another construction scheme.
PWC martingales from pure jump martingales with vanishing compensator
Pure jump martingales can easily be obtained by taking the difference of a pure jump increasing process with a predictable, grounded, right-continuous process of bounded variation (called its dual predictable projection or predictable compensator). The simplest example is probably the compensated Poisson process with parameter λ defined by (M_{t}=N_{t}−λt, t≥0). This process is a pure jump martingale with piecewise linear sample paths, hence is not a PWC martingale, as \(\sum _{s\leq t} \Delta M_{s} = N_{t} \neq M_{t}\). However, it is easy to see that the difference of two Poisson processes with the same intensity is a PWC martingale (in fact, a step martingale), and we shall generalize this idea in the following proposition.
Proposition 2
An \(\mathbb {F}\)-martingale of integrable variation Z (with Z_{0}=0) is PWC if and only if there exist two \(\mathbb {F}\)-adapted, integrable and increasing pure jump processes A and B having the same dual predictable projections (i.e., A^{p}=B^{p}) such that Z=A−B.
Proof
Assume first that Z is a PWC martingale of integrable variation. We define
$$A_{t} := \sum_{s\leq t} \left(\Delta Z_{s}\right)^{+} \quad \text{and} \quad B_{t} := \sum_{s\leq t} \left(\Delta Z_{s}\right)^{-}\,.$$
Then A and B are two increasing pure jump processes such that Z=A−B. They are integrable since Z is of integrable variation, and they satisfy A^{p}−B^{p}=(A−B)^{p}=Z^{p}=0 which proves the first implication.
Assume now that A and B are pure jump increasing processes. We then have the representation
By the triangle inequality, we may check that
hence Z is integrable and of integrable variation. Finally, by definition of the dual predictable projections, the processes A−A^{p} and B−B^{p} are martingales, hence, by difference, so is Z since A^{p}=B^{p}. □
An easy application of this result is the case of Lévy processes, for which the compensators are deterministic functions.
Corollary 2
Let A,B be two Lévy processes having the same Lévy measure ν, and consider a measurable function f such that f(0)=0 and \(\int _{\mathbb {R}} |f(x)| \nu (dx)<+\infty \). Then the process
$$Z_{t} := \sum_{s\leq t} f\left(\Delta A_{s}\right) - \sum_{s\leq t} f\left(\Delta B_{s}\right)$$
is a PWC martingale.
Proof
The proof follows from the fact that the compensator of \( \left (\sum _{s\leq t} f(\Delta A_{s}), t \geq 0 \right)\) is the deterministic process \(\left (t\int _{\mathbb {R}} f(x) \nu (dx), t \geq 0 \right) \). □
Remark 2
A centered Lévy process Z is a PWC martingale if and only if it has no drift, no Brownian component, and its Lévy measure ν satisfies the integrability condition \(\int _{\mathbb {R}} |x| \nu (dx)<\infty \).
As obvious examples, one can mention the difference of two independent Gamma or Poisson processes with the same parameters. Note that stable subordinators are not allowed here, as they do not fulfill the integrability condition required on A and B. We give below the PDFs of these two examples:
Example 2
Let N^{1},N^{2} be two independent Poisson processes with parameter λ. Then, Z:=N^{1}−N^{2} is a step martingale taking integer values, with marginal laws given by the Skellam distribution with parameters μ_{1}=μ_{2}=λt :
$$\mathbb{P}(Z_{t}=k) = \mathrm{e}^{-2\lambda t}\, I_{|k|}(2\lambda t)\,, \qquad k\in\mathbb{Z}\,,$$
where I_{k} is the modified Bessel function of the first kind.
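The Skellam law of Example 2 is easy to check numerically. The following pure-Python sketch (our illustration; `bessel_i` truncates the power series of I_{k}, and the sampler uses Knuth's classical product-of-uniforms method) compares a Monte Carlo sample of \(N^{1}_{t}-N^{2}_{t}\) with the pmf \(e^{-2\lambda t}I_{|k|}(2\lambda t)\):

```python
import math
import random

def bessel_i(k, x, terms=60):
    """Modified Bessel function I_k(x) (integer k >= 0) via its
    truncated power series; accurate for moderate x."""
    return sum((x / 2.0) ** (2 * m + k) / (math.factorial(m) * math.factorial(m + k))
               for m in range(terms))

def skellam_pmf(k, mu):
    """P(N^1_t - N^2_t = k) when both Poisson means equal mu = lam * t."""
    return math.exp(-2.0 * mu) * bessel_i(abs(k), 2.0 * mu)

def sample_poisson(mu, rng):
    """Poisson sampling by Knuth's product-of-uniforms method."""
    limit, k, prod = math.exp(-mu), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(0)
mu, n = 2.0, 20000                 # mu = lam * t, Monte Carlo sample size
counts = {}
for _ in range(n):
    z = sample_poisson(mu, rng) - sample_poisson(mu, rng)
    counts[z] = counts.get(z, 0) + 1
```

The pmf sums to one (via the identity \(\sum_{k\in\mathbb{Z}} I_{k}(x)=e^{x}\)), is symmetric in k, and matches the empirical frequencies.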
Example 3
Let γ^{1},γ^{2} be two independent Gamma processes with parameters a,b>0. Then, Z:=γ^{1}−γ^{2} is a PWC martingale with marginals given by
where K_{β} denotes the modified Bessel function of the second kind with parameter \(\beta \in \mathbb {R}\).
Proof
The PDF of Z_{t} is given, for 2at>1, by the inverse Fourier transform, see Gradshteyn and Ryzhik (2007, p. 349 Formula 3.385(9)) :
The result then follows by analytic continuation. □
We conclude this section with an example of a PWC martingale which does not belong to the family of Lévy processes but has the interesting feature of evolving in a time-dependent range.
Corollary 3
Let R^{1},R^{2} be two squared Bessel processes of dimension δ∈(0,2). For i=1,2, set
$$g_{t}^{0}\left(R^{i}\right) := \sup\left\{s\leq t:\, R_{s}^{i}=0\right\}\,.$$
Then, Z:=g^{0}(R^{1})−g^{0}(R^{2}) is a 1-self-similar PWC martingale which evolves in the cone {[−t,t],t≥0}.
Proof
Let R be a squared Bessel process of dimension δ∈(0,2) and denote by L^{0}(R) its local time at 0 as given by Tanaka’s formula. Set
In Rainer (1996, Prop. 4.1 and 6.2.1), it is proven that the process \( X = Y - \frac {1}{2^{2-\frac {\delta }{2}} \Gamma \left (2-\frac {\delta }{2} \right)} L^{0}(R)\) is a martingale with respect to the slow filtration \(\left (\mathcal {F}_{g_{t}^{0}+}, t \geq 0 \right)\). We shall prove that
is also a martingale in the same filtration. Notice first that since the random variable \(g_{t}^{0}(R)\) follows the generalized Arcsine law (see Section 3.1.2 below), the expectation of this process is constant and equal to 0. We then apply Itô’s formula to Y with the function \(f(y) = y^{\frac {2}{2-\delta }}\) :
Observe next that the instants of jumps of Y are the same as those of g^{0}(R), i.e., \(\left \{s; \, Y_{s}\neq Y_{s^{-}} \right \} = \left \{ s; \, g_{s}^{0}(R) \neq g_{s^{-}}^{0}(R) \right \}\). But the jumps of g^{0}(R) only happen at times s when R_{s}=0, in which case \(g_{s}^{0}(R)=s\) or, equivalently, Y_{s}=0. This yields the simplifications :
and it remains to prove that the stochastic integral is a martingale. Since the support of dL^{0}(R) is included in {s; R_{s}=0}⊂{s;Y_{s}=0}, and L^{0}(R) is continuous, we deduce that
hence the process
is a local martingale. To prove that it is a true martingale, choose a horizon T and observe that the process
is now a positive local martingale, hence a supermartingale with constant expectation, hence a true martingale. Finally, the self-similarity of \(g_{t}^{0}(R)\) comes from that of R (see Revuz and Yor 1999, Proposition 1.6, p. 443). Indeed, for any fixed t>0 :
□
Remark 3
When δ=1, we have X=W^{2} where W is a standard Brownian motion. Using Lévy’s Arcsine law, the PDF of Z_{1} is given by the convolution, for z∈[0,1] :
where F denotes the incomplete elliptic integral of the first kind, see Gradshteyn and Ryzhik (2007, p. 275, Formula 3.147(5)). This yields, by symmetry and scaling :
Both the recursive and the vanishing-compensator approaches are rather restrictive in terms of attainable range and analytical tractability. In the next subsection, we provide a more general method that can be used to build PWC martingales evolving in any connected subset of \(\mathbb {R}\) in a simple and tractable way.
PWC martingales using timechanged techniques
In this section, we construct a PWC martingale Z by time-changing a latent (\(\mathbb {P}, \mathbb {F}\))-martingale \(\tilde {Z}=\left (\tilde {Z}_{t}\right)_{t\geq 0}\) with the help of a suitable time-change process θ.
Definition 2
(Time-change process) An \(\mathbb {F}\)-time-change process θ=(θ_{t})_{t∈[0,T]} is a stochastic process satisfying

θ_{0}=0,

for any t∈[0,T], θ_{t} is \(\mathcal {F}_{t}\)-measurable (i.e., θ is adapted to the filtration \(\mathbb {F}\)),

the map t↦θ_{t} is a.s. càdlàg and nondecreasing.
Under mild conditions stated below, \(Z:=\left (\tilde {Z}_{\theta _{t}}\right)_{t\geq 0}\) is proven to be a martingale with respect to its own filtration, with the desired piecewise constant behavior. Most results regarding time-changed martingales deal with continuous martingales time-changed with a continuous process (Cont and Tankov 2004; Jeanblanc et al. 2007; Revuz and Yor 1999). This does not provide a satisfactory solution to our problem, as the resulting martingale will obviously have continuous sample paths. On the other hand, it is obvious that not all time-changed martingales remain martingales, so that conditions are required on \(\tilde {Z}\) and/or on θ.
Remark 4
Every \(\mathbb {F}\)-martingale time-changed with an \(\mathbb {F}\)-adapted process remains a semimartingale, but not necessarily a martingale. For instance, setting \(\tilde {Z} = W\) and θ_{t}= inf{s:W_{s}>t}, then \(\tilde {Z}_{\theta _{t}} = t\). Also, if θ is independent from \(\tilde {Z}\), then the martingale property is always satisfied, but Z may fail to be integrable. For example, if \(\tilde {Z}=W\) and θ is an independent α-stable subordinator with α=1/2, then the time-changed process Z is not integrable: \(\mathbb {E} \left [ \left| \tilde {Z}_{\theta _{t}} \right| \,\middle|\, \theta _{t} \right ] = \sqrt {\frac {2}{\pi }} \sqrt {\theta _{t}}\) and \(\mathbb {E} \left [ \sqrt {\theta _{t}} \right ] \) is infinite. The proposition below gives sufficient conditions for Z to be integrable.
Proposition 3
Let \(\tilde {Z}\) be a martingale, and θ be a time-change process independent from \(\tilde {Z}\). We assume that θ has PWC paths and that one of the following assumptions holds:

1.
\(\tilde {Z}\) is a positive martingale,

2.
\(\tilde {Z}\) is uniformly integrable,

3.
there exists an increasing function k such that θ_{t}≤k(t) a.s. for all t.
Then \(Z: = \left (\tilde {Z}_{\theta _{t}}\right)_{t\geq 0}\) is a martingale with respect to its natural filtration.
Proof
We first check that Z is integrable.

1.
When \(\tilde {Z}\) is a positive martingale, we have \(\mathbb {E} [ |Z_{t}| ] = \mathbb {E} \left [ \tilde {Z}_{\theta _{t}} \right ] = \mathbb {E}[Z_{0}] < + \infty \).

2.
When \(\tilde {Z}\) is uniformly integrable, we have
\(\mathbb {E}[|Z_{t}|] = \mathbb {E} \left [ \left | \tilde {Z}_{\theta _{t}} \right | \right ] \leq \mathbb {E} \left [\left |\tilde {Z}_{\infty }\right |\right ] < + \infty \).

3.
When θ_{t}≤k(t) a.s. for all t, we have \(\mathbb {E}[|Z_{t}|] = \mathbb {E} \left [ \left | \tilde {Z}_{\theta _{t}} \right | \right ] \leq \mathbb {E} \left [ \left | \tilde {Z}_{k(t)} \right | \right ] < +\infty.\)
Next, to prove the martingale property, define the larger filtration \(\mathbb {G}\) given for s≥0 by \(\mathcal {G}_{s} = \sigma \left (\left (\theta _{u}, u\geq 0 \right), \left (\tilde {Z}_{u}, u\leq \theta _{s}\right)\right)\). Applying the tower property of conditional expectation with 0≤s≤t, we obtain :
where the second equality follows from the independence between \(\tilde {Z}\) and θ. Finally, since θ has PWC paths, so does Z :
which ends the proof. □
From a practical point of view, general time-change processes θ that are unbounded on [0,T] may cause some problems. Indeed, to simulate sample paths of Z on [0,T], one needs to simulate sample paths of \(\tilde {Z}\) on [0,θ_{T}]. This is annoying, as θ_{T} can take arbitrarily large values. Hence, the class of time-change processes θ that are bounded by some function k on [0,T] for any T<∞ whilst preserving analytical tractability proves to be quite interesting. This condition is of course violated by most of the standard time-change processes (e.g., integrated CIR, Poisson, Gamma, or compound Poisson subordinators). A naive alternative consists in capping the latter, but this would trigger some difficulties. For instance, using θ_{t}=N_{t}∧t, where N is a Poisson process, would mean that Z=Z_{0} before the first jump of N, but then the resulting process may have linear pieces (hence not be piecewise constant). There exist, however, simple time-change processes θ satisfying \(\sup_{s\in[0,t]}\theta_{s}\leq k(t)\) for some function k bounded on any closed interval, while being piecewise constant, having stochastic jumps, and having a nonzero probability of jumping in any time set of nonzero measure. Building PWC martingales using such types of processes is the purpose of the next section.
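To illustrate the sampling advantage just discussed, here is a sketch (our own, with our names; the latent Brownian motion is discretized on a grid, which is an approximation) in which θ is a lazy clock of the Poisson type studied below, namely the last Poisson arrival before t (zero if none): since θ_{T}≤T, the latent path is only ever needed on [0,T]:

```python
import random

def lazy_martingale_path(T, lam, n_grid, seed=None):
    """Sketch of Z_t = W_{theta_t}: theta_t is the last Poisson(lam)
    arrival <= t (0 if none) and W an independent Brownian motion
    sampled on a grid of [0, T].  Since theta_T <= T a.s., W is never
    needed beyond T."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(lam)    # Poisson arrival times on [0, T]
        if t > T:
            break
        arrivals.append(t)
    dt = T / n_grid
    w = [0.0]                        # latent Brownian path on the grid
    for _ in range(n_grid):
        w.append(w[-1] + rng.gauss(0.0, dt ** 0.5))
    def W(s):                        # grid read-out of the latent path
        return w[min(int(s / dt), n_grid)]
    grid = [i * dt for i in range(n_grid + 1)]
    theta, last, j = [], 0.0, 0
    for s in grid:                   # theta_s = last arrival <= s
        while j < len(arrivals) and arrivals[j] <= s:
            last = arrivals[j]
            j += 1
        theta.append(last)
    Z = [W(th) for th in theta]
    return grid, theta, Z

grid, theta, Z = lazy_martingale_path(T=5.0, lam=1.0, n_grid=500, seed=7)
```

By construction, θ never exceeds calendar time, and Z only moves when θ does, yielding piecewise constant sample paths.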
Lazy martingales
We first present a stochastic time-change process such that the calendar time is always ahead of the stochastic clock, that is, one satisfying the boundedness requirement above with the linear boundary k(t)=t. We then use the latter to create PWC martingales.
Lazy clocks
We would like to define stochastic clocks that keep time frozen almost everywhere, can jump occasionally, but cannot go ahead of the real clock. Those stochastic clocks would then exhibit piecewise constant paths, and the last constraint has the nice feature that any stochastic process Z adapted to \(\mathbb {F}\) is also adapted to \(\mathbb {F}\) enlarged with the filtration generated by θ. In particular, we do not need to know the value of Z after the real time t. As far as Z is concerned, only the sample paths of Z (in fact \(\tilde {Z}\)) up to θ_{t}≤t matter. In the sequel, we consider a specific class of such processes, called lazy clocks hereafter, that have the property that the stochastic clock typically “sleeps” (i.e., is “on hold”) but gets synchronized to the calendar time at some random times.
Definition 3
(Lazy clock) The stochastic process \(\theta : \mathbb {R}^{+} \rightarrow \mathbb {R}^{+},~t \mapsto \theta _{t}\) is an \(\mathbb {F}\)-lazy clock if it satisfies the following properties:

it is an \(\mathbb {F}\)-time-change process: in particular, it is grounded (θ_{0}=0), càdlàg and nondecreasing;

it has piecewise constant sample paths : \(\theta _{t} = \sum _{s\leq t} \Delta \theta _{s}\);

it can jump at any time and, when it does, it synchronizes to the calendar clock, i.e., there is the equality \(\{s>0 ;\, \theta _{s}\neq \theta _{s^{-}}\} = \{s>0; \,\theta _{s} =s\}.\)
In the sense of this definition, Poisson and compound Poisson processes are examples of subordinators that keep time frozen almost everywhere, but they are not lazy clocks, as nothing constrains them to reach the calendar time at each jump time (i.e., they do not satisfy θ_{τ}=τ at every jump time τ). Neither are their capped versions, as there are some intervals during which θ cannot jump, or grows linearly.
Remark 5
Note that for each t>0, the random variable θ_{t} is a priori not an \(\mathbb {F}\)-stopping time. By contrast, if \(\mathbb {F}\) is right-continuous, the first passage time of the stochastic process θ beyond a given level is a stopping time. More precisely, the sequence (C_{t}, t≥0),
$$C_{t} := \inf\{s\geq 0\,:\; \theta_{s} > t\}\,,$$
is an increasing family of \(\mathbb {F}\)-stopping times. Conversely, the lazy clock θ=(θ_{t}, t≥0) is a family of \((\mathcal {F}_{C_{s}},\,s\geq 0)\)-stopping times; see Revuz and Yor (1999, Chapter V, Prop. (1.1)).
We now show that lazy clocks are essentially linked with last passage times, as stated in the next proposition.
Proposition 4
A process θ is an \(\mathbb {F}\)-lazy clock if and only if there exists a càdlàg process A starting from 0, adapted to \(\mathbb {F}\), such that the set \(\mathcal {Z}:=\{s;\, A_{s^{-}}=0\text { or }A_{s}=0\}\) has a.s. zero Lebesgue measure and θ=g with
$$g_{t} := \sup\left\{s\leq t\,;\; A_{s^{-}}=0 \text{ or } A_{s}=0\right\}\,.$$
Proof
If θ is a lazy clock, then the result is immediate by taking A_{t}=θ_{t}−t, which is càdlàg and whose set of zeroes coincides with the jumps of θ, hence is countable. Conversely, fix a scenario ω∈Ω. Since A is càdlàg, the set \(\mathcal {Z}(\omega)=\{s;\, A_{s^{-}}(\omega)=0 \text { or } A_{s}(\omega)=0\}\) is closed, hence its complement may be written as a countable union of disjoint intervals. We claim that
Indeed, observe first that since s↦g_{s}(ω) is increasing, it has a countable number of discontinuities, hence the union on the right-hand side is countable. Furthermore, the intervals which are not empty are such that A_{s}(ω)=0 or \(A_{s^{-}}(\omega)=0\) and g_{s}(ω)=s. In particular, if s_{1}<s_{2} are associated with nonempty intervals, then \(g_{s_{1}}(\omega)=s_{1} \leq g_{s_{2}^{-}}(\omega)\), which proves that the intervals are disjoint.
Now, let \(u\in \mathcal {Z}^{c}(\omega)\). Then A_{u}(ω)≠0. Define \(d_{u}(\omega) = \inf \{s\geq u; \,A_{s^{-}}(\omega)=0 \text { or } A_{s}(\omega)=0\}\). By right-continuity, d_{u}(ω)>u. We also have \(A_{u^{-}}(\omega)\neq 0\), which implies that g_{u}(ω)<u. Therefore, u∈]g_{u}(ω),d_{u}(ω)[, which is nonempty, and this may also be written \(u\in ]g_{d_{u}^{-}(\omega)}(\omega),\, g_{d_{u}(\omega)}(\omega)[\), which proves the first inclusion. Conversely, it is clear that if \(u\in ]g_{s^{-}}(\omega), g_{s}(\omega)[\), then A_{u}(ω)≠0 and \(A_{u^{-}}(\omega)\neq 0\). Otherwise, we would have \(u=g_{u}(\omega)\leq g_{s^{-}}(\omega)\), which would be a contradiction. Equality (6) is thus proved. Finally, it remains to write:
since \(\mathcal {Z}\) has zero Lebesgue measure. □
Remark 6

1.
Note that lazy clocks are naturally involved with PWC martingales. Indeed, if M is a PWC martingale, then \(M_{t} = M_{g_{t}(M)}\phantom {\dot {i}\!}\) where g_{t}(M)= sup{s≤t, ΔM_{s}≠0} is a lazy clock.

2.
If \(\mathbb {G}\) denotes the natural filtration of the process A, then, following the definition in Dellacherie and Meyer (Dellacherie et al. 1992, Chapter XX, Section 28), we see that θ is adapted to the slow filtration \((\mathcal {G}_{g_{t}+})_{t\geq 0}\).

3.
It was observed in Remark 5 that lazy clocks are, in general, not stopping times. \(\mathbb {F}\)-lazy clocks are, however, \(\mathbb {F}\)-honest times; see, e.g., Aksamit and Jeanblanc (2017); Mansuy and Yor (2006). To see this, observe first that when s≥t, θ_{t} is obviously \(\mathcal {F}_{s}\)-measurable. Consider now the case s<t. Conditionally on the event {g_{t}<s}, we have g_{t}<s<t. By definition, the lazy clock takes a constant value on [g_{t},t), leading to g_{t}=g_{s}. Therefore g_{t} is (conditionally) \(\mathcal {F}_{s}\)-measurable in this case as well. This shows that g_{t} is an honest time. Observe that honest times are known to be closely linked with last passage times. In this specific context, the connection is given in Proposition 4.

4.
The natural filtration of a lazy clock is called a lazy filtration, by extension of the slow filtration.
We give below a few examples of lazy clocks related to last passage times prior to a given time t, whose PDF is known explicitly. Whereas some of these random variables (and the corresponding distributions) have been studied in the literature, we use last passage times as clocks, i.e. in a dynamic way, as stochastic processes evolving with t.
Poisson lazy clocks
Let (X_{n},n≥1) be strictly positive random variables and consider the counting process N:=(N_{t})_{t≥0} defined as
$$N_{t} := \sum_{n\geq 1} \mathbb{1}_{\{X_{1}+\dots+X_{n}\leq t\}}\,. $$
Then the process (g_{t}(N),t≥0) defined as the last jump time of N prior to t, or zero if N did not jump by time t:
$$g_{t}(N) := \sup\{s\leq t:\, \Delta N_{s} \neq 0\}\,, \qquad \sup\emptyset := 0\,, $$
is a lazy clock. Its cumulative distribution function (CDF) is easily given, for s≤t, by \(\mathbb {P}(g_{t}(N) \leq s) = \mathbb {P}(N_{t}=N_{s})\). If N is a Poisson process with intensity λ, i.e. when the random variables (X_{k}, k≥1) are i.i.d. with an exponential distribution of parameter λ, we obtain in particular \(\mathbb {P}(g_{t}(N)\leq s) = e^{-\lambda (t-s)}\); see Vrins (2016) for similar computations. Sample paths are shown in Fig. 1.
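This CDF is easy to check by simulation. The sketch below (a minimal illustration; function name is ours) draws the last Poisson jump time before t and compares the empirical CDF with e^{-λ(t-s)}:

```python
import math
import random

def last_jump_before(t, lam, rng):
    """Sample g_t(N) for a Poisson process N with intensity lam: draw
    exponential inter-arrival times and return the last jump time not
    exceeding t (or 0 if no jump occurred by time t)."""
    s, g = 0.0, 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return g
        g = s

rng = random.Random(0)
lam, t, s = 2.0, 1.0, 0.6
n = 200_000
emp = sum(last_jump_before(t, lam, rng) <= s for _ in range(n)) / n
print(emp, math.exp(-lam * (t - s)))  # empirical vs. theoretical CDF at s
```

Note that the atom of g_{t}(N) at 0 (no jump by time t) is automatically accounted for, since the event {g_{t}(N)≤s} includes {N_{t}=0}.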
Diffusion lazy clock
Another simple example is given by the last passage time \(g_{t}^{a}(X)\) of a diffusion X to some level a before time t. Its CDF may be written, applying the Markov property :
where T_{a}= inf{u≥0: X_{u}=a}.

Let \(b\in \mathbb {R}\) and consider the drifted Brownian motion (X_{t})_{t≥0}, X_{t}:=B_{t}−bt. Then, the probability density function (PDF) of \(g_{t}^{a}(B-b)\) is given by (see for instance Salminen (1988) or Kahale (2008)):
$$f_{g^{a}_{t}(B-b)}(s) = \frac{\phi\left(\frac{a+bs}{\sqrt{s}}\right)}{\sqrt{s}} \left(\frac{2}{\sqrt{t-s}}\, \phi\left(b\sqrt{t-s}\right) + 2b\, \Phi\left(b\sqrt{t-s}\right) - b \right)\,, \quad 0< s< t\,, $$where Φ denotes the standard Normal CDF and ϕ=Φ^{′} its density. Note that when a≠0, the distribution of \(g_{t}^{a}(B-b)\) may have a mass at 0, see Shreve (2004, Corollary 7.2.2).

Let R be a Bessel process with dimension δ∈(0,2) and set \(\nu =\frac {\delta }{2}-1\). Then, the PDF of \(g_{t}^{0}(R)\) is given by the generalized Arcsine law (see Gradinaru et al. (1999)):
$$f_{g_{t}^{0}(R)}(s) = \frac{1}{\Gamma(-\nu)\Gamma(1+\nu)}\, (t-s)^{\nu}\, s^{-1-\nu}~~,~~ 0< s< t \;. $$
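As a numerical sanity check, assuming the density f(s) = (t-s)^{ν} s^{-1-ν}/(Γ(-ν)Γ(1+ν)) on (0,t) (our reading of the generalized Arcsine law, with the minus signs restored), one can verify that it integrates to one; for ν=-1/2 (δ=1) it reduces to the classical arcsine law:

```python
import math

t, nu = 1.0, -0.5          # delta = 2(1 + nu) = 1: classical arcsine law
const = 1.0 / (math.gamma(-nu) * math.gamma(1.0 + nu))

def density(s):
    # generalized arcsine density of g_t^0(R), 0 < s < t
    return const * (t - s) ** nu * s ** (-1.0 - nu)

# midpoint rule; the endpoint singularities are integrable since nu in (-1, 0)
n = 400_000
h = t / n
total = sum(density((k + 0.5) * h) for k in range(n)) * h
print(total)  # should be close to 1
```

Up to the rescaling s↦s/t, this is the Beta(-ν, 1+ν) density, so g_{t}^{0}(R)/t follows a Beta law.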
Stable lazy clock
The generalized Arcsine law also appears when dealing with stable Lévy processes L with parameter α∈(1,2]. Then, from Bertoin (1996, Chapter VIII, Theorem 12), the PDF of \(g_{t}^{0}(L)\) is given by :
Timechanged martingales with lazy clocks
In this section, we introduce lazy martingales. A lazy martingale Z is defined as a stochastic process obtained by time-changing a latent martingale \(\tilde {Z}\) with an independent lazy clock θ. Lazy martingales \(Z=\left (\tilde {Z}_{\theta _{t}}\right)_{t\geq 0}\) are expected to be PWC martingales; this is proven in Theorem 1 below. Note that from Point 3) of Proposition 3, the process Z is always a martingale, i.e. no assumptions are needed on \(\tilde {Z}\).
We first show that (in most situations) the lazy clock is adapted to the filtration generated by Z. This is done by observing that the knowledge of θ amounts to the knowledge of its jump times, since the size of each jump is obtained as a difference with the calendar time. In particular, the properties of lazy clocks allow one to reconstruct the trajectories of Z on [0,t] only from past values of \(\tilde {Z}\) and θ; no information about the future (measured according to the real clock) is required. We then provide the resulting distribution when the clock g(N) is governed by Poisson, inhomogeneous Poisson, or Cox processes.
Theorem 1
Let \(\tilde {Z}\) be a martingale independent from the lazy clock θ. Then \(Z = \tilde {Z}_{\theta }\) is a PWC martingale in its natural filtration \(\mathbb {F}\). If furthermore \(\mathbb {F}\) is assumed to be complete and if \(\forall u\neq v,\; \mathbb {P} \left (\tilde {Z}_{u} =\tilde {Z}_{v} \right) = 0\), then θ is adapted to the filtration of Z.
Proof
Since by definition θ_{t}≤t for any t≥0, we first deduce from Point 3) of Proposition 3 that the process Z is a PWC martingale. Then, the fact that θ is adapted to the natural filtration of Z follows from the identity
Indeed, observe that the set \(\mathcal {N} = \{0< s\leq t; \,Z_{s}= Z_{s^{-}} \text { and } \theta _{s}=s\}\) is of measure zero since, using the independence between \(\tilde{Z}\) and θ,
thanks to the assumption \(\forall u\neq v,\; \mathbb {P} \left (\tilde {Z}_{u} =\tilde {Z}_{v} \right) = 0\). Therefore, we have
and taking the supremum on both sides and using Point 3) in the definition of a lazy clock, we deduce that \(\theta _{t} = \sup \left \{s\leq t;\;Z_{s}\neq Z_{s^{-}} \right \}\) a.s., which proves that θ is adapted to the natural filtration of Z since \(\mathbb {F}\) is complete. □
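The identity θ_{t} = sup{s≤t: Z_{s}≠Z_{s^-}} says that the clock can be read off a single path of Z. A toy illustration on a discrete grid (all names are ours):

```python
def clock_from_path(times, values):
    """Recover theta along a grid from a piecewise constant path of Z:
    theta_t is the last grid time at which the observed value changed."""
    theta, g, prev = [], 0.0, values[0]
    for t, v in zip(times, values):
        if v != prev:
            g, prev = t, v
        theta.append(g)
    return theta

times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
path = [1.0, 1.0, 0.7, 0.7, 0.7, 1.2]   # jumps at t = 1.0 and t = 2.5
print(clock_from_path(times, path))     # [0.0, 0.0, 1.0, 1.0, 1.0, 2.5]
```

The assumption \(\mathbb{P}(\tilde{Z}_{u}=\tilde{Z}_{v})=0\) matters here: it rules out jumps of θ that leave the value of Z unchanged and would therefore be invisible in the path.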
Example 4
Let \(\tilde {Z}\) be a continuous martingale and N an independent Poisson process with intensity λ. Then, Z=(Z_{t})_{t≥0} defined as \(Z_{t}:=\tilde {Z}_{g_{t}(N)}\) is a right-continuous PWC martingale in its natural filtration with the same range as \(\tilde{Z}\). Moreover, its CDF is given by
This result follows from the example of Subsection 3.1.1, using the independence assumption between \(\tilde {Z}\) and N :
A similar result applies when N is a Cox process, i.e. an inhomogeneous Poisson process whose intensity λ:=(λ_{t})_{t≥0} is an independent (positive) stochastic process.
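To illustrate Example 4 concretely, the sketch below (our construction; the latent martingale is taken to be a Brownian motion) compares the Monte Carlo CDF of \(Z_{t}=\tilde{Z}_{g_{t}(N)}\) with the mixture implied by the law of the Poisson lazy clock, namely the atom e^{-λt} at Z_{0} plus the clock density λe^{-λ(t-s)} integrated against P(B_{s}≤z):

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_Z(t, lam, rng):
    """One draw of Z_t = B_{g_t(N)}, with B a Brownian motion independent
    of the Poisson process N: first sample the lazy clock, then the
    latent Gaussian value at that (frozen) time."""
    u, g = 0.0, 0.0
    while True:
        u += rng.expovariate(lam)
        if u > t:
            break
        g = u
    return rng.gauss(0.0, math.sqrt(g)) if g > 0.0 else 0.0

rng = random.Random(1)
lam, t, z = 2.0, 1.0, 0.3
n = 200_000
emp = sum(sample_Z(t, lam, rng) <= z for _ in range(n)) / n

# mixture CDF: atom exp(-lam*t) at Z_0 = 0 (it contributes since z >= 0)
# plus the density lam*exp(-lam*(t-s)) integrated against P(B_s <= z)
m = 10_000
h = t / m
mix = math.exp(-lam * t) + sum(
    lam * math.exp(-lam * (t - (k + 0.5) * h)) * norm_cdf(z / math.sqrt((k + 0.5) * h))
    for k in range(m)
) * h
print(emp, mix)  # the two estimates agree
```

The atom at Z_{0} is the distinctive feature of lazy martingales: with probability e^{-λt} the clock never left 0, so Z_{t}=Z_{0}.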
Corollary 4
Let N be a Cox process independent from \(\tilde {Z}\) and define \(P(s,t):=\mathbb {E} \left [ e^{-(\Lambda _{t}-\Lambda _{s})} \right ]\), where \(\Lambda _{t} := \int _{0}^{t}\lambda _{u} du\). Then,
Proof
If λ is deterministic, i.e. in the inhomogeneous Poisson case, a direct adaptation of Example 4 yields the expression
Now, the Cox case may be obtained from the inhomogeneous Poisson case by conditioning with respect to the (independent) stochastic intensity. Indeed, applying the tower property of conditional expectation:
where in the second line we have used the independence between λ and \(\tilde {Z}\), and in the last equality Tonelli’s theorem to exchange the integral and expectation operators when applied to nonnegative functions. Finally, from the Leibniz rule, \(\lambda _{s} e^{-(\Lambda _{t}-\Lambda _{s})}=\frac {d}{ds} e^{-(\Lambda _{t}-\Lambda _{s})}\), so
□
Remark 7
Notice that P(s,t) does not correspond to the expectation of \(e^{-\int _{s}^{t} \lambda _{u} du}\) conditional upon \(\mathcal {F}_{s}\), the sigma-field generated by λ up to s, as is often the case, e.g., in mathematical finance. It is an unconditional expectation that can be evaluated with the help of the tower law. In the specific case where λ is an affine process, for example if \(\mathbb {E} \left [e^{-\int _{s}^{t} \lambda _{u} du} \,\middle|\, \lambda _{s} = x \right ]\) takes the form A(s,t)e^{−B(s,t)x} for some deterministic functions A, B, then
where \(\varphi _{\lambda _{s}}(u): = \mathbb {E} \left [ e^{iu\lambda _{s}} \right ]\) denotes the characteristic function of the random variable λ_{s}.
Example 5
In the case where λ follows a CIR process, i.e. if \(d \lambda _{t} = k (\theta - \lambda _{t}) dt + \sigma \sqrt {\lambda _{t}} dW_{t}\) with λ_{0}>0, then λ_{s} has the same law as r_{s}/c_{s}, where \(c_{s}=\nu /(\theta (1-e^{-ks}))\) and r_{s} is a noncentral chi-squared random variable with ν=4kθ/σ^{2} degrees of freedom and noncentrality parameter \(\kappa _{s}=c_{s}\lambda _{0}e^{-ks}\). In this case, \(\varphi _{\lambda _{s}}(u) = \mathbb {E} \left [ \mathrm {e}^{i(u/c_{s}) r_{s}} \right ] = \varphi _{r_{s}} (u/c_{s})\), where \(\varphi _{r_{s}}(v) = \frac {\exp \left (\frac {\kappa _{s} iv}{1-2iv} \right)} {(1-2iv)^{\nu /2}}\).
Some lazy martingales without independence assumption
We have seen that when \(\tilde {Z}\) is a martingale and θ an independent lazy clock, then \(\left (Z_{t}=\tilde {Z}_{\theta _{t}}, \, t \geq 0\right)\) is a PWC martingale in its natural filtration. We now give an example where the lazy clock θ is not independent from the latent process \(\tilde {Z}\).
Proposition 5
Let B and W be two correlated Brownian motions with correlation coefficient ρ and let f be a continuous function. Define the lazy clock:
$$g_{t}^{f}(W) := \sup\left\{s\leq t:\, W_{s} = f(s)\right\}, \qquad \sup\emptyset := 0\,. $$
Let h(W) be a progressively measurable process with respect to the natural filtration of W and such that \(\mathbb {E} \left [ \int _{0}^{t} h^{2}_{u}(W) du \right ] <+\infty \) for any t≥0. Assume that there exists a deterministic function ψ such that:
$$\int_{0}^{g_{t}^{f}(W)} h_{u}(W)\, dW_{u} = \psi\left(g_{t}^{f}(W)\right) \quad \text{a.s.} $$
Then, the process \(Z = \left (\int _{0}^{g_{t}^{f}(W)} h_{u}(W) dB_{u} - \rho \psi \left (g_{t}^{f}(W) \right),\; t \geq 0 \right)\) is a lazy martingale in its natural filtration.
Proof
Let β be a Brownian motion independent from W such that \(B = \rho W+\sqrt {1-\rho ^{2}}\,\beta \). We first write:
Observe now that Z is integrable, since from Itô’s isometry :
Define next the larger filtration \(\mathbb {G}=(\mathcal {G}_{t})_{t\geq 0}\) given by \(\mathcal {G}_{t}=\sigma \left ((W_{u}, u \geq 0), \left (\beta _{u}, u\leq g_{t}^{f}(W)\right )\right )\). Using the tower property of conditional expectations:
since, conditionally on some scenario ω (hence with t↦W_{t}(ω) some fixed continuous path), the random variable \(\int _{g_{s}^{f}(W(\omega))}^{g_{t}^{f}(W(\omega))} h_{u}(W(\omega)) d\beta _{u}\) is a centered Gaussian random variable with variance \(\int _{g_{s}^{f}(W(\omega))}^{g_{t}^{f}(W(\omega))} h^{2}_{u}(W(\omega)) du\), independent from \(\left (\beta _{u}, u\leq g_{s}^{f}(W(\omega)) \right)\), hence
□
It is interesting to point out here that the latent process \(\tilde {Z}_{t} = \int _{0}^{t} h_{u}(W) dB_{u} - \rho \psi (t)\) is, in general, not a martingale (not even a local martingale). One obtains a martingale thanks to the lazy time-change.
Example 6
We give below several examples of application of this proposition.

1.
Take h_{u}=1. Then, ψ=f and \(\left (B_{g_{t}^{f}(W)} - \rho f\left (g_{t}^{f}(W)\right), \,t\geq 0\right)\) is a PWC martingale.
More generally, we may observe from the proof above that if H is a spacetime harmonic function (i.e. (t,z)→H(t,z) is \(\mathcal {C}^{1,2}\) and such that \(\frac {\partial H}{\partial t} + \frac {1}{2} \frac {\partial ^{2} H}{\partial z^{2}} =0\)), then the process
$$\left(H\left(B_{g_{t}^{f}(W)} - \rho f \left(g_{t}^{f}(W) \right),\; \left(1-\rho^{2}\right) g_{t}^{f}(W)\right),\; t \geq 0 \right) $$is a PWC martingale. Notice in particular that the latent process here is not, in itself, a martingale.

2.
Following the same idea, take \(h_{u}(W) = \frac {\partial H}{\partial z}(W_{u},u)\) for some harmonic function H. Then
$$\int_{0}^{g_{t}^{f}(W)} \frac{\partial H}{\partial z}(W_{u},u)\, dW_{u} = H\left(W_{g_{t}^{f}(W)},\, g_{t}^{f}(W)\right) - H(0,0) = H\left(f\left(g_{t}^{f}(W)\right),\, g_{t}^{f}(W)\right) - H(0,0) $$and the process
$$\left(\int_{0}^{g_{t}^{f}(W)} \frac{\partial H}{\partial z}(W_{u},u)\, dB_{u} - \rho H\left(f\left(g_{t}^{f}(W)\right),\, g_{t}^{f}(W)\right),\; t \geq 0 \right) $$is a PWC martingale.

3.
Consider the stochastic process \(\tilde {Z}\) whose time-t value is defined as the stochastic integral, with respect to B and up to time t, of a \(\mathcal {C}^{1}\) function of the local time of W at 0. Then, the time-changed integral (Z_{t})_{t≥0}, \(Z_{t}:= \tilde {Z}_{g^{0}_{t}(W)}\), is a PWC martingale in its natural filtration. To see this, take f=0 and \(h_{u}=r\left (L^{0}_{u}\right)\), where r is a \(\mathcal {C}^{1}\) function and L^{0} denotes the local time of W at 0. Then, integrating by parts:
$$\int_{0}^{g_{t}^{0}(W)} r \left(L^{0}_{u}\right) dW_{u} = r \left(L^{0}_{g_{t}^{0}(W)} \right) W_{g_{t}^{0}(W)} - \int_{0}^{g_{t}^{0}(W)} W_{u}\, r^{\prime} \left(L^{0}_{u}\right) dL^{0}_{u} = 0 $$since the support of dL^{0} is included in {u: W_{u}=0}. Therefore, the process (Z_{t}, t≥0), \(Z_{t}:=\int _{0}^{g_{t}^{0}(W)} r\left (L^{0}_{u}\right)dB_{u}\), is a PWC martingale.
Numerical simulations
In this section, we briefly sketch the construction schemes to sample paths of the lazy clocks discussed above. These procedures have been used to generate Fig. 1. Finally, we illustrate sample paths and distributions of a specific martingale in [0,1] timechanged with a Poisson lazy clock.
Sampling of lazy clock and lazy martingales
By definition, the number of jumps of a lazy clock θ on [0,T] is countable, but may be infinite. Therefore, except in some specific cases (such as the Poisson lazy clock), an exact simulation is impossible. Using a discretization grid, the simulated trajectories of a lazy clock θ on [0,T] will take the form
where τ_{0}:=0 and τ_{1},τ_{2},… are (some of) the synchronization times of the lazy clock up to time T. We can thus focus on sampling the times τ_{1},τ_{2},… whose values are no greater than T.
Poisson lazy clock
Trajectories of a Poisson lazy clock θ_{t}(ω)=g_{t}(N(ω)) on a fixed interval [0,T] are very easy to obtain thanks to the properties of Poisson jump times.
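Concretely, one can draw the Poisson jump times once and then evaluate the clock along any time grid; a minimal sketch (function name is ours):

```python
import random

def poisson_lazy_clock_path(T, lam, grid, rng):
    """One trajectory of theta_t = g_t(N) along the given time grid: draw
    the Poisson jump times on [0, T] once, then record the last jump time
    before each grid point (0 before the first jump)."""
    jumps, u = [], 0.0
    while True:
        u += rng.expovariate(lam)
        if u > T:
            break
        jumps.append(u)
    path, g, i = [], 0.0, 0
    for t in grid:
        while i < len(jumps) and jumps[i] <= t:
            g = jumps[i]
            i += 1
        path.append(g)
    return path

rng = random.Random(42)
grid = [j / 100 for j in range(101)]
theta = poisson_lazy_clock_path(1.0, 5.0, grid, rng)
print(theta[-1])  # last synchronization time before T = 1
```

By construction the sampled path satisfies θ_{t}≤t and is non-decreasing, the two defining pathwise properties of a lazy clock.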
Brownian lazy clock
Sampling a trajectory of a Brownian lazy clock requires sampling the last zero of a Brownian bridge. This is the purpose of the following lemma.
Lemma 1
Let W^{x,y,t} be a Brownian bridge on [0,t], t≤T, starting at \(W_{0}^{{x,y,t}}=x\) and ending at \(W_{t}^{{x,y,t}}=y\), and define its last passage time at 0:
$$g_{t}\left(W^{x,y,t}\right) := \sup\left\{s\leq t:\, W_{s}^{x,y,t}=0\right\}\,, \qquad \sup\emptyset := 0\,. $$
Then, the CDF F(x,y,t;s) of g_{t}(W^{x,y,t}) is given, for s∈[0,t] by :
In particular, the probability that W^{x,y,t} does not hit 0 during [0,t] equals:
Note also the special case when y=0 :
Proof
Using time reversal and the absolute continuity formula of the Brownian bridge with respect to the free Brownian motion (see Salminen (1997)), the PDF of g_{t}(W^{x,y,t}) is given, for y≠0, by:
Integrating over [0,t], we first deduce that
We shall now compute a modified Laplace transform of F, and then invert it. Integrating by parts and using (14), we deduce that :
Observe next that by a change of variable :
hence
and the result follows by inverting this Laplace transform thanks to the formulae, for a>0 and b>0 :
and
□
Simulating a continuous trajectory of a Brownian lazy clock θ in a perfect way is an impossible task. The reason is that (W_{t})_{t≥s} hits the level W_{s} infinitely many times during any arbitrarily short period following s. In particular, the path t↦W_{t}(ω) crosses 0 infinitely many times in the time interval [0,ε] for every ε>0; see e.g. Baldi (2017, pp. 58-59, Remark 4) or Karatzas and Shreve (2005, p. 94, Problem 7.18). Consequently, it is impossible to depict such trajectories in a perfect way. Just as for Brownian motion itself, one can only hope to sample trajectories on a discrete time grid, where the maximum step size provides some control over the approximation error and corresponds to a basic unit of time. By doing so, we disregard the specific jump times of θ, but focus on the supremum of the zeroes of a Brownian motion in these intervals. To do this, we proceed as follows.
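The following sketch implements this grid-based approximation (all names are ours): whenever the simulated Brownian increments change sign over a step, a zero lies inside that step, and we move the clock to the step's right endpoint. A refinement would use Lemma 1 to place the last bridge zero inside the step rather than at its boundary.

```python
import math
import random

def brownian_lazy_clock_path(T, n, rng):
    """Grid approximation of theta_t = g_t^0(W): simulate W on n steps
    and, whenever the path changes sign over a step, move the clock to
    the right endpoint of that step (a zero lies inside the step)."""
    h = T / n
    w, g = 0.0, 0.0
    path = [0.0]
    for j in range(1, n + 1):
        w_new = w + math.sqrt(h) * rng.gauss(0.0, 1.0)
        if w * w_new <= 0.0:      # sign change (or exact zero) in this step
            g = j * h
        w = w_new
        path.append(g)
    return path

rng = random.Random(7)
theta = brownian_lazy_clock_path(1.0, 1000, rng)
print(theta[-1])  # approximate last zero of W before T = 1
```

Note that this scheme misses the zeros of excursions contained strictly inside a step; refining the grid reduces, but by the above never eliminates, this error.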
Example 7
(PWC martingale on (0,1)) Let N be a Poisson process with intensity λ and \(\tilde {Z}\) be the Φ-martingale (Jeanblanc and Vrins 2018) with constant diffusion coefficient η,
$$d\tilde{Z}_{t} = \eta\, \phi\left(\Phi^{-1}\left(\tilde{Z}_{t}\right)\right) dW_{t}\,. $$
Then, the stochastic process Z defined as \(Z_{t}:=\tilde {Z}_{g_{t}(N)}\), t≥0, is a pure jump martingale on (0,1) with CDF
Some sample paths for \(\tilde {Z}\) and Z are drawn on Fig. 2. Notice that all the martingales \(\tilde {Z}\) given above can be simulated without error using the exact solution.
Figure 3 gives the CDF of Z and \(\tilde {Z}\), where the latter is a Φ-martingale. The main differences between these two sets of curves result from the fact that \(\mathbb {P} \left (\tilde {Z}_{t}=Z_{0} \right) = 0 \) for all t>0 while \(\mathbb {P} \left (Z_{t}=Z_{0} \right) = \mathbb {P} \left (\tilde {Z}_{g_{t}(N)} = Z_{0} \right) = \mathbb {P} (N_{t}=0)>0\), and from the delay induced by the fact that Z_{t} corresponds to some past value of \(\tilde {Z}\).
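For completeness, here is a sketch of the exact simulation of the Φ-martingale at a fixed date, using the closed-form solution \(\tilde{Z}_{t} = \Phi\left(e^{\eta^{2}t/2}\left(\Phi^{-1}(\tilde{Z}_{0}) + \eta\int_{0}^{t} e^{-\eta^{2}s/2}\,dW_{s}\right)\right)\); this is our reconstruction of the exact solution, to be verified against Jeanblanc and Vrins (2018) before relying on it:

```python
import math
import random
from statistics import NormalDist

N01 = NormalDist()

def phi_martingale_exact(z0, eta, t, rng):
    """Exact draw of the Phi-martingale at time t: the stochastic integral
    I_t = int_0^t e^{-eta^2 s/2} dW_s is centered Gaussian with variance
    (1 - e^{-eta^2 t}) / eta^2, so a single Gaussian draw suffices."""
    var_i = (1.0 - math.exp(-eta * eta * t)) / (eta * eta)
    x = math.exp(eta * eta * t / 2.0) * (
        N01.inv_cdf(z0) + eta * math.sqrt(var_i) * rng.gauss(0.0, 1.0)
    )
    return N01.cdf(x)

rng = random.Random(3)
z0, eta, t = 0.3, 1.0, 1.0
draws = [phi_martingale_exact(z0, eta, t, rng) for _ in range(200_000)]
print(min(draws), max(draws), sum(draws) / len(draws))  # range (0,1), mean ~ z0
```

Time-changing such exact draws with the Poisson lazy clock of Subsection 3.1.1 yields exact samples of Z_{t} as well, since g_{t}(N) is independent of \(\tilde{Z}\).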
Conclusion and future research
Many applications, in particular in mathematical finance, rely extensively on martingales. In this context, discrete or continuous-time processes are commonly considered. However, in some specific cases, such as when we work under partial information or when market quotes arrive in a scarce way, it is more realistic to assume that conditional expectations move in a piecewise constant fashion. Such processes have received little attention so far, and our paper aims at filling this gap. We focused on the construction of piecewise constant martingales, that is, martingales whose trajectories are piecewise constant. Such processes are indeed good candidates to model the dynamics of conditional expectations of random variables under partial (punctual) information. The time-changed approach proves to be quite powerful: starting with a martingale in a given range, we obtain a PWC martingale by using a piecewise constant time-change process. Among those time-change processes, lazy clocks are specifically appealing: these are time-change processes always staying in arrears of the real clock, which synchronize to the calendar time at some random times. This ensures that θ_{t}≤t, a convenient feature when one needs to sample trajectories of the time-change process. Such random times can typically be characterized as last passage times, and enjoy appealing tractability properties. The last jump time of a Poisson process before the current time, for instance, exhibits a very simple distribution. Other lazy clocks have been proposed as well, based on Brownian motions and Bessel processes, some of which rule out the probability mass at zero. We provided several martingales time-changed with lazy clocks (called lazy martingales) whose range can be any interval of \(\mathbb {R}\) (depending on the range of the latent martingale), and showed that the corresponding distributions can be easily obtained in closed form.
Finally, we presented algorithms to sample Poisson and Brownian lazy clocks, thereby providing the reader with a workable toolbox to efficiently use piecewise constant martingales in practice.
This paper paves the way for further research, in both probability theory and mathematical finance. Tractability and, even more importantly, the martingale property result from the independence assumption between the latent martingale and the time-change process. It might be interesting, however, to consider cases where the sampling frequency (the synchronization rate of the lazy clock θ to the real clock) depends on the level of the latent martingale \(\tilde {Z}\). Finding a tractable model allowing for this coupling remains an open question at this stage. On the other hand, it is yet unclear how dealing with more realistic processes like piecewise constant ones would impact hedging strategies and model completeness in finance. Investigating this route is the purpose of a research project that we are about to initiate.
Notes
 1.
We are grateful to an anonymous referee for pointing this out.
Abbreviations
CDF: Cumulative distribution function
CIR: Cox–Ingersoll–Ross
PWC: Piecewise constant
PDF: Probability density function
References
Aksamit, A., Jeanblanc, M.: Enlargement of Filtrations with Finance in View. Springer, Switzerland (2017).
Altman, E., Brady, B., Resti, A., Sironi, A.: The link between defaults and recovery rates: theory, empirical evidence, and implications. Technical report, Stern School of Business (2003).
Amraoui, S., Cousot, L., Hitier, S., Laurent, J. P.: Pricing CDOs with statedependent stochastic recovery rates. Quant. Finan. 12(8), 1219–1240 (2012).
Andersen, L., Sidenius, J.: Extensions to the gaussian copula: random recovery and random factor loadings. J. Credit Risk. 1(1), 29–70 (2004).
Baldi, P.: Stochastic Calculus. Universitext. Springer, Switzerland (2017).
Bertoin, J.: Lévy processes, volume 121 of Cambridge Tracts in Mathematics. Cambridge University Press, Cambridge (1996).
Boel, R., Varaiya, P., Wong, E.: Martingales on jump processes. I. Representation results. SIAM J. Control. 13(5), 999–1021 (1975).
Boel, R., Varaiya, P., Wong, E.: Martingales on jump processes. II. Applications. SIAM. J. Control. 13(5), 1022–1061 (1975).
Cont, R., Tankov, P.: Financial Modelling with Jump Processes. Chapman & Hall, USA (2004).
Dellacherie, C., Maisonneuve, B., Meyer, P. A.: Probabilités et Potentiel  Processus de Markov. Hermann, France (1992).
Gaspar, R., Slinko, I.: On recovery and intensity’s correlation  a new class of credit models. J. Credit Risk. 4(2), 1–33 (2008).
Gradinaru, M., Roynette, B., Vallois, P., Yor, M.: Abel transform and integrals of Bessel local times. Ann. Inst. H. Poincaré Probab. Statist. 35(4), 531–572 (1999).
Gradshteyn, I. S., Ryzhik, I. M.: Table of integrals, series, and products. seventh edition. Elsevier/Academic Press, Amsterdam (2007).
Herdegen, M., Herrmann, S.: Single jump processes and strict local martingales. Stoch. Process. Appl. 126(2), 337–359 (2016).
Jacod, J., Skorohod, A. V.: Jumping filtrations and martingales with finite variation. In: Séminaire de Probabilités, XXVIII, volume 1583 of Lecture Notes in Math, pp. 21–35. Springer, Berlin (1994).
Jeanblanc, M., Vrins, F.: Conic martingales from stochastic integrals. Math. Financ. 28(2), 516–535 (2018).
Jeanblanc, M., Yor, M., Chesney, M.: Martingale Methods for Financial Markets. Springer Verlag, Berlin (2007).
Kahale, N.: Analytic crossing probabilities for certain barriers by Brownian motion. Ann. Appl. Probab. 18(4), 1424–1440 (2008).
Karatzas, I., Shreve, S.: Brownian Motion and Stochastic Calculus. Springer, New York (2005).
Mansuy, R., Yor, M.: Random Times and Enlargement of Filtrations in a Brownian Setting. Lecture Notes in Mathematics. Springer, Berlin Heidelberg (2006).
Markit: ISDA CDS Standard Model. Technical report (2004). http://www.cdsmodel.com/cdsmodel/.
Protter, P.: Stochastic Integration and Differential Equations. Second edition. Springer, Berlin (2005).
Rainer, C.: Projection d’une diffusion sur sa filtration lente. In: Séminaire de Probabilités, XXX, volume 1626 of Lecture Notes in Math, pp. 228–242. Springer, Berlin (1996).
Revuz, D., Yor, M.: Continuous martingales and Brownian motion. SpringerVerlag, NewYork (1999).
Salminen, P.: On the first hitting time and the last exit time for a Brownian motion to/from a moving boundary. Adv. Appl. Probab. 20(1), 411–426 (1988).
Salminen, P.: On last exit decompositions of linear diffusions. Studia. Sci. Math. Hungar. 33(1–3), 251–262 (1997).
Shreve, S. E.: Stochastic Calculus for Finance vol. II  Continuoustime models. Springer, New York (2004).
Vrins, F.: Characteristic function of timeinhomogeneous Lévydriven OrnsteinUhlenbeck processes. Stat. Probab. Lett. 116, 55–61 (2016).
Acknowledgements
The authors are grateful to M. Jeanblanc, D. Brigo, K. Yano, and G. Zhengyuan for stimulating discussions about an earlier version of this manuscript. We wish to thank the two anonymous referees and the AE whose many suggestions helped to improve the presentation of this paper.
Funding
This research benefitted from the support of the Chaire Marchés en Mutation from the Fédération Bancaire Française. The project “Dynamic Modeling of Recovery Rates” receives the support of WallonieBruxelles International and of the Fonds de la Recherche Scientifique, of the Ministère Français des Affaires étrangères et européennes, of the Ministère de l’Enseignement supérieur et de la Recherche via the instrument Partenariats Hubert Curien. This work was supported by the Fonds de la Recherche Scientifique FNRS under Grant J.0037.18.
Availability of data and materials
Data sharing not applicable to this article as no datasets were generated or analysed during the current study.
Author information
Contributions
The authors contributed equally to this paper. They both read and approved the final manuscript and jointly bear full responsibility regarding potential remaining errors.
Corresponding author
Correspondence to Frédéric Vrins.
Ethics declarations
Authors’ information
C.P. is with the Laboratoire de Mathématiques et Modélisation d’Évry. He is the author of the monograph Peacocks and Associated Martingales, With Explicit Constructions with F. Hirsh, B Roynette, and M. Yor. F.V. served as a quantitative analyst on the trading floor of a major European bank, before moving back to academia. He is Research Director of the Louvain Finance Center (LFIN) and Faculty Member of the Center for Operations Research and Econometrics (CORE).
Competing interests
The authors declare that they have no competing interests.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License(http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Keywords
 Timechange process
 Last passage time
 Martingale
 Bounded martingale
 Jump martingale
 Subordinator