# Ruin Probabilities under a Markovian Risk Model


Volume 19 (4) – Nov 2, 2015
10 pages
Publisher
Springer Journals
Copyright © 2003 by Springer-Verlag Berlin Heidelberg
Subject
Mathematics; Applications of Mathematics; Math Applications in Computer Science; Theoretical, Mathematical and Computational Physics
ISSN
0168-9673
eISSN
1618-3932
DOI
10.1007/s10255-003-0136-9

### Abstract

In this paper, a Markovian risk model is developed in which the occurrence of claims is described by a point process {N(t)}_{t≥0}, with N(t) the number of jumps of a Markov chain during the interval [0, t]. For this model, the explicit form of the ruin probability Ψ(0) and a bound on the convergence rate of the ruin probability Ψ(u) are obtained using the generalized renewal technique developed in the paper. Finally, we prove that the ruin probability Ψ(u) is a linear combination of negative exponential functions in the special case where the claims are exponentially distributed and the Markov chain has an intensity matrix (q_{ij})_{i,j∈E} such that q_m = q_{m1} and q_i = q_{i,i+1} for 1 ≤ i ≤ m−1.
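The model described above can be illustrated numerically. The sketch below is not the paper's analytical method; it is a minimal Monte Carlo simulation, under assumed parameter values, of a surplus process U(t) = u + c·t − S(t) in which each jump of a continuous-time Markov chain with intensity matrix Q = (q_{ij}) triggers an exponentially distributed claim. The function name, the choice of a finite horizon, and the cyclic example matrix (mirroring the special case q_i = q_{i,i+1}, q_m = q_{m1}) are all illustrative assumptions.

```python
import random

def ruin_probability_mc(u, c, Q, mu, horizon, n_paths, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability Psi(u).

    A claim of Exp(mu) size is paid at each jump time of a continuous-time
    Markov chain with intensity matrix Q (off-diagonal entries q_ij >= 0,
    rows summing to zero).  Since the premium rate c > 0, the surplus can
    only become negative at a claim instant, so we check ruin only there.
    """
    rng = random.Random(seed)
    m = len(Q)
    ruined = 0
    for _ in range(n_paths):
        t, state, claims = 0.0, 0, 0.0
        while True:
            rate = -Q[state][state]            # total exit rate q_i of state i
            t += rng.expovariate(rate)         # holding time until the next jump
            if t > horizon:
                break                          # no ruin within the horizon
            claims += rng.expovariate(mu)      # claim paid at the jump time
            if u + c * t - claims < 0:         # surplus below zero: ruin
                ruined += 1
                break
            # pick the next state j != i with probability q_ij / q_i
            r = rng.random() * rate
            for j in range(m):
                if j != state:
                    r -= Q[state][j]
                    if r <= 0:
                        state = j
                        break
    return ruined / n_paths

# Example: a 3-state chain with cyclic transitions 1 -> 2 -> 3 -> 1,
# echoing the paper's special case (all parameter values are illustrative).
Q = [[-1.0, 1.0, 0.0],
     [0.0, -2.0, 2.0],
     [3.0, 0.0, -3.0]]
psi_0 = ruin_probability_mc(u=0.0, c=2.0, Q=Q, mu=1.0,
                            horizon=200.0, n_paths=2000)
```

As the abstract suggests, in this exponential-claim special case Ψ(u) decays as a mix of negative exponentials in u, which a plot of such estimates against u would exhibit.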

### Journal

Acta Mathematicae Applicatae Sinica, Springer Journals

Published: Nov 2, 2015
