<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="description" content="Zeyu Ding - Ph.D. candidate in Statistics">
<meta name="author" content="Zeyu Ding">
<link rel="stylesheet" href="style.css">
<link href="https://fonts.googleapis.com/css2?family=Playfair+Display:wght@400;600&family=Lora:wght@400;500&display=swap" rel="stylesheet">
<link rel="icon" href="assets/images/favicon.ico" type="image/x-icon">
<title>Zeyu Ding - Personal Website</title>
</head>
<body>
<header>
<img src="assets/images/profile.jpg" alt="Profile Picture" class="profile-pic">
<h1>Zeyu Ding</h1>
<p>Ph.D. Candidate in Statistics</p>
</header>
<nav>
<ul>
<li><a href="#about">About Me</a></li>
<li><a href="#education">Education</a></li>
<li><a href="#research">Research</a></li>
<li><a href="#projects">Projects</a></li>
<li><a href="#experience">Experience</a></li>
<li><a href="#academic-activities">Academic Activities</a></li>
<li><a href="#skills">Skills</a></li>
<li><a href="#contact">Contact</a></li>
</ul>
</nav>
<main>
<section id="about">
<h2>About Me</h2>
<p>I am a Ph.D. candidate at TU Dortmund specializing in Bayesian Statistics, Machine Learning, and Data Compression techniques for high-dimensional models. My research aims to develop efficient statistical methods for analyzing and modeling complex, high-dimensional data, with applications in scientific computing and large-scale simulation. My core expertise lies in Bayesian Statistics, and I am particularly interested in exploring its potential for advancing generative modeling methods.</p>
</section>
<section id="education">
<h2>Education</h2>
<ul>
<li><strong>Ph.D. in Statistics</strong>, TU Dortmund, Germany (2021 - 2025)</li>
<li><strong>M.Sc. in Quantitative Economics</strong>, Georg-August-University Göttingen, Germany (2017 - 2020)</li>
<li><strong>B.Sc. in Quantitative Economics</strong>, Xi’an Jiaotong University, China (2011 - 2015)</li>
</ul>
</section>
<section id="research">
<h2>Research Interests</h2>
<ul>
<li>Bayesian Statistics and Machine Learning</li>
<li>High-dimensional Data Compression and Approximation</li>
<li>Generative Models: GANs, Diffusion Models, and Normalizing Flows</li>
<li>Exploring Transformer Architectures for High-dimensional Generative Modeling</li>
<li>Integrating Bayesian Methods with Deep Learning Frameworks</li>
</ul>
</section>
<section id="projects">
<h2>Projects</h2>
<ul>
<li>
<strong>Big Data for Advanced Classification Models</strong> (2021 - 2023)<br>
Developed scalable Bayesian algorithms for high-dimensional probit and logistic regression models.
</li>
<li>
<strong>AI for Physics (KISS Project)</strong> (2023 - Present)<br>
Applied advanced Bayesian and Monte Carlo methods for particle physics simulation.
</li>
<li>
<strong>Big Data for Copula Models</strong> (2023 - Present)<br>
Created efficient data compression algorithms for multivariate conditional transformation models.
</li>
</ul>
</section>
<section id="experience">
<h2>Working Experience</h2>
<ul>
<li><strong>Scientific Researcher</strong>, The Lamarr Institute for Machine Learning and Artificial Intelligence, Germany (2024 - Present)</li>
<li><strong>Scientific Researcher</strong>, TU Dortmund, Germany (2021 - Present)</li>
<li><strong>Intern in Quantitative Risk Management</strong>, Daimler Mobility AG, Germany (2019 - 2020)</li>
<li><strong>Intern in Risk Management</strong>, China Construction Bank, Frankfurt Branch (2019)</li>
</ul>
</section>
<!-- <section id="publications">
<h2>Publications</h2>
<ul>
<li><strong>Scalable Bayesian p-Generalized Probit and Logistic Regression</strong>, Advances in Data Analysis and Classification (2024)</li>
<li><strong>Bayesian Analysis for Dimensionality Reduction</strong>, Machine Learning under Resource Constraints (2023)</li>
<li><strong>Efficiency Coresets Techniques for MCTMs</strong>, ongoing work (submitted 2024)</li>
<li><strong>A Benchmark Suite for Monte Carlo Sampling Algorithms</strong>, ongoing work (submitted 2024)</li>
<li><strong>Adaptive Sliced Maximum Mean Discrepancy with Generalized Kernels and Random Fourier Features</strong>, ongoing work (planned 2025)</li>
<li><strong>Enhancing Score Matching with P-Normalized Kernels: Theory and Langevin Dynamics Implementation</strong>, ongoing work (planned 2025)</li>
<li><strong>Regularization and Prior Choice for the Bayesian Generalized Probit Model</strong>, ongoing work (planned 2025)</li>
</ul>
</section>
-->
<section id="academic-activities">
<h2>Academic Activities</h2>
<!-- Publications Section -->
<div id="publications">
<h3>Publications</h3>
<ul>
<li>
<strong>Scalable Bayesian p-Generalized Probit and Logistic Regression</strong><br>
<em>Advances in Data Analysis and Classification, 2024</em><br>
Developed scalable Bayesian algorithms for high-dimensional classification problems.
</li>
<li>
<strong>Bayesian Analysis for Dimensionality and Complexity Reduction</strong><br>
<em>Machine Learning under Resource Constraints, De Gruyter, Berlin, 2023</em><br>
Unified Bayesian approaches for dimensionality reduction in resource-constrained environments.
</li>
<li>
<strong>Efficiency Coresets Techniques for Multivariate Conditional Transformation Models</strong><br>
<em>submitted, 2024</em><br>
Proposed innovative coreset methods for high-dimensional data compression in generative models.
</li>
<li>
<strong>A Benchmark Suite for Monte Carlo Sampling Algorithms</strong><br>
<em>submitted, 2024</em><br>
Developed new Monte Carlo sampling test metrics for academic and non-academic users.
</li>
</ul>
</div>
<!-- Talks Section -->
<div id="talks">
<h3>Talks</h3>
<ul>
<li>
<strong>A Benchmark Suite for Monte Carlo Sampling Algorithms</strong><br>
<em>17th International Conference on Computational and Methodological Statistics (CMStatistics), KCL, London, Dec. 2024</em><br>
Poster presentation.
</li>
<li>
<strong>Artificial Intelligence for Large-Scale Scientific Simulations</strong><br>
<em>KISS Project Workshop, University of Hamburg, Feb. 2024</em><br>
Explored AI techniques in high-energy physics simulations with CERN’s LHC data.
</li>
<li>
<strong>Efficiency Coresets Techniques for Multivariate Conditional Transformation Models</strong><br>
<em>16th International Conference on Computational and Methodological Statistics (CMStatistics), Berlin, Dec. 2023</em><br>
Presented data compression techniques for multivariate conditional transformations.
</li>
<li>
<strong>Scalable Bayesian p-Generalized Probit and Logistic Regression via Coresets</strong><br>
<em>15th International Conference on Computational and Methodological Statistics (CMStatistics), KCL, London, Dec. 2022</em><br>
Discussed computational efficiency in Bayesian high-dimensional classification.
</li>
<li>
<strong>6th International Summer School 2022 on Machine Learning under Resource Constraints</strong><br>
<em>Poster, TU Dortmund, Sep. 2022</em><br>
Presented topics on Bayesian models and coreset approaches.
</li>
</ul>
</div>
<!-- Ongoing Research Section -->
<div id="ongoing-research">
<h3>Ongoing Research</h3>
<ul>
<li>
<strong>Adaptive Sliced Maximum Mean Discrepancy with Generalized Kernels and Random Fourier Features</strong><br>
</li>
<li>
<strong>Enhancing Score Matching with P-Normalized Kernels: Theory and Langevin Dynamics Implementation</strong><br>
</li>
<li>
<strong>Regularization and Prior Choice for the Bayesian Generalized Probit Model</strong><br>
</li>
</ul>
</div>
</section>
<section id="skills">
<h2>Skills</h2>
<h3>Programming Languages</h3>
<ul>
<li>Advanced: Python, R</li>
<li>Proficient: SAS, Julia, SQL</li>
<li>Intermediate: VBA, PySpark, PyTorch</li>
</ul>
<h3>Statistical and Machine Learning Expertise</h3>
<ul>
<li>Bayesian Methods: MCMC, Prior Design, Model Selection</li>
<li>Machine Learning: Gradient Boosting, Random Forests, SVMs</li>
<li>Deep Learning: Neural Networks, RNNs, LSTMs, Bayesian Neural Networks</li>
<li>Statistical Modeling: GLMs, Time Series (ARIMA, GARCH, etc.)</li>
</ul>
</section>
<section id="contact">
<h2>Contact</h2>
<ul>
<li>Email: <a href="mailto:zeyu.ding@tu-dortmund.de">zeyu.ding@tu-dortmund.de</a></li>
<li>GitHub: <a href="https://github.com/zeyudsai" target="_blank">zeyudsai</a></li>
<li>LinkedIn: <a href="https://www.linkedin.com/in/zeyu-ding-sai/" target="_blank">Zeyu Ding</a></li>
</ul>
</section>
</main>
<footer>
<p>© 2024 Zeyu Ding. All rights reserved.</p>
</footer>
</body>
</html>