Abstract

Analyzing clinical diagnoses alongside high-dimensional imaging data, while accounting for the piecewise constant nature of the imaging, presents challenges to existing statistical approaches. In this paper, we propose a generalized tensor regression framework with Internal Variation (IV) regularization to address these challenges. The IV regularization explicitly exploits the rich spatial structure, particularly the piecewise constant nature, of high-order imaging data, albeit at the cost of a more complex algorithm and a more demanding theoretical investigation. We develop an efficient IV-regularized optimization procedure for estimating the unknown scalar and tensor coefficients. We investigate the theoretical properties of the scalar and tensor coefficient estimates, especially the error bounds of the regularized tensor coefficient estimates. Extensive numerical studies assess the finite-sample performance of our method, demonstrating its superiority over existing approaches. Finally, we apply the proposed method to a chronic sinusitis computed tomography (CT) imaging dataset and identify the most activated subregion across the maxillary sinus cavity associated with the diagnosis.
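To make the role of the IV penalty concrete, the following is a minimal numerical sketch assuming an anisotropic, total-variation-style form of internal variation (the sum of absolute differences between adjacent tensor entries along each mode); the precise IV definition used in the paper may differ. The function name `iv_penalty` and the toy tensors are illustrative only.

```python
import numpy as np

def iv_penalty(B):
    """Assumed IV-style penalty on a coefficient tensor B: the sum of
    absolute differences between adjacent entries along every mode.
    A piecewise constant tensor yields a small value, which is why such
    a penalty promotes piecewise constant coefficient estimates."""
    return sum(np.abs(np.diff(B, axis=k)).sum() for k in range(B.ndim))

# A piecewise constant toy tensor: one constant block inside zeros.
flat = np.zeros((4, 4, 4))
flat[:2] = 1.0  # only the single block boundary contributes to the penalty

# An unstructured tensor of the same size for comparison.
noisy = np.random.default_rng(0).normal(size=(4, 4, 4))

print(iv_penalty(flat))                     # -> 16.0 (one 4x4 boundary slice)
print(iv_penalty(flat) < iv_penalty(noisy)) # -> True
```

Minimizing a loss plus this penalty trades data fit against spatial roughness, so the estimated tensor coefficient tends toward contiguous constant subregions rather than scattered voxel-level activity.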

All authors contributed equally to this paper, and their names are listed in alphabetical order.

Information

Preprint No.: SS-2024-0281
Manuscript ID: SS-2024-0281
Complete Authors: Yang Bai, Ting Li, Yang Sui
Corresponding Author: Yang Bai
E-mail: statbyang@mail.shufe.edu.cn


Acknowledgments

The authors thank the reviewers, associate editor, and co-editor for their helpful suggestions and comments. In addition, Li's research is partially supported by the National Science Foundation of China (Grant 12571304), the Shanghai Pujiang Programme (No. 24PJC030), and the Program for Innovative Research Team of Shanghai University of Finance and Economics. Bai's research is supported by the Program for Innovative Research Team of Shanghai University of Finance and Economics and the Shanghai Research Center for Data Science and Decision Technology.

Supplementary Materials

In the supplementary material, we present the complete algorithm for estimating the parameters and provide additional numerical results. In addition, we give some useful lemmas and the proofs of the theorems.


Supplementary materials are available for download.