Optimal maintenance policies for a safety-critical system and its deteriorating sensor. (English) Zbl 1411.90131

Summary: We consider the integrated problem of optimally maintaining an imperfect, deteriorating sensor and the safety-critical system it monitors. The sensor’s costless observations of the binary state of the system become less informative over time. A costly full inspection may be conducted to perfectly discern the state of the system, after which the system is replaced if it is in the out-of-control state. In addition, a full inspection provides the opportunity to replace the sensor. We formulate the problem of adaptively scheduling full inspections and sensor replacements using a partially observable Markov decision process (POMDP) model. The objective is to minimize the total expected discounted costs associated with system operation, full inspection, system replacement, and sensor replacement. We show that the optimal policy has a threshold structure and demonstrate the value of coordinating system and sensor maintenance via numerical examples.
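A minimal, self-contained Python sketch of the kind of belief-threshold inspection rule described above, under a hypothetical parameterization (P_DETERIORATE, BASE_ACCURACY, DECAY, and INSPECT_THRESHOLD are illustrative values, not taken from the paper): it performs a Bayes update of the probability that the system is out of control, lets sensor accuracy decay with sensor age, and triggers a full inspection once that belief crosses a threshold. This illustrates only the structural form of the policy, not the authors' POMDP formulation or its optimal thresholds.

import random

# Hypothetical parameters for illustration only (not from the paper).
P_DETERIORATE = 0.05      # per-period probability the system goes out of control
BASE_ACCURACY = 0.95      # sensor accuracy when new
DECAY = 0.03              # per-period loss of sensor accuracy
INSPECT_THRESHOLD = 0.40  # trigger a full inspection once the belief exceeds this level

def sensor_accuracy(sensor_age):
    """Probability the sensor reports the true binary state, decaying with age."""
    return max(0.5, BASE_ACCURACY - DECAY * sensor_age)

def belief_update(belief, observation, sensor_age):
    """Bayes update of P(out of control) after one period of possible
    deterioration and one noisy sensor reading (1 = 'out of control')."""
    prior = belief + (1.0 - belief) * P_DETERIORATE
    acc = sensor_accuracy(sensor_age)
    like_out = acc if observation == 1 else 1.0 - acc   # likelihood if out of control
    like_in = 1.0 - acc if observation == 1 else acc    # likelihood if in control
    return like_out * prior / (like_out * prior + like_in * (1.0 - prior))

def threshold_policy(belief):
    """Belief-threshold rule of the structural form whose optimality the paper establishes."""
    return "inspect" if belief >= INSPECT_THRESHOLD else "continue"

if __name__ == "__main__":
    random.seed(0)
    belief, sensor_age = 0.0, 0
    for t in range(20):
        obs = int(random.random() < 0.2)   # stand-in observation stream
        belief = belief_update(belief, obs, sensor_age)
        if threshold_policy(belief) == "inspect":
            print(f"t={t}: belief={belief:.2f} -> full inspection; reset belief and sensor")
            belief, sensor_age = 0.0, 0
        else:
            sensor_age += 1

Here a full inspection resets both the belief and the sensor age, mirroring the coordinated system-and-sensor maintenance decision studied in the paper.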

MSC:

90B25 Reliability, availability, maintenance, inspection in operations research
90C40 Markov and semi-Markov decision processes
