This semester's MMathSci thesis talks will be held on Wednesday 30 Nov. All are welcome. There is a mix of in-person and online talks, between 10am-12pm and 2:30-4:30pm. All relevant information is below. I hope to see you there!

Location: Carslaw AGR and https://uni-sydney.zoom.us/j/89458130430

Schedule:
10:00 - Mingxi Huang (online)
10:30 - Tiancong Cheng (in person)
11:00 - Shuyang Mo (online)
11:30 - Richard Alessi (in person)
2:30 - Muzhi Yu (online)
3:00 - Paul Fortuin (in person)
3:30 - Ruixuan Bi (online)
4:00 - Zhoujiajing Wang (online)

Talk Info:

1. Richard Alessi
Title: Allocation strategies in binary response trials
Abstract: This thesis examines strategies for sequentially allocating patients to treatments in a clinical trial with a binary outcome and known covariates, with the aim of making these techniques accessible in business contexts analogous to clinical trials. We explore a range of techniques, with differing performance, through a simulation study. The allocation strategy that best suits a situation depends on the relative importance of achieving a high success rate versus making reliable inference, as well as on the treatment effect size and the sample size.

2. Ruixuan Bi
Title: On Theory and Applications of Time Series Models
Abstract: Volatility is a statistical measure of the dispersion of returns for a given stock or market index. In the time series literature, models which attempt to explain changes in the conditional variance are generally known as conditional heteroscedastic models. In this study, fractionally integrated GARCH (FIGARCH) models are applied to financial time series with long memory behaviour. Our study also introduces stochastic volatility (SV) models, whose main advantage is a random component adaptable to abrupt changes. A parameter estimation method based on state-space methods is derived and shown to be accurate through a simulation study. These models are then used to forecast the one-step-ahead volatility of Nasdaq 100 data. The results indicate the good performance of FIGARCH and SV models in estimating financial market volatility.

3. Tiancong Cheng
Title: Analysis and Applications of Bilinear Time Series Models with Heteroscedastic Errors
Abstract: ARMA models are used to model linear time series. However, many time series in practice are not linear. To analyse such time series, Granger and Anderson (1978) proposed a family of bilinear time series models whose errors are a sequence of uncorrelated random variables with zero mean and constant variance. For many financial time series, however, a constant noise variance is too restrictive and the noise is regarded as heteroscedastic. To model bilinear time series with time-dependent variance, generalised autoregressive conditional heteroscedastic (GARCH) processes can be used as the innovations. This project investigates the theory and application of bilinear models with GARCH errors and presents a simulation study and some applications using real-world data.
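As a rough illustration of the heteroscedastic errors discussed in talks 2 and 3, here is a minimal Python sketch of a plain GARCH(1,1) innovation process. This is not the FIGARCH or bilinear-GARCH models from the theses themselves; the parameter values are illustrative assumptions only.

import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    # e_t = sigma_t * z_t with z_t ~ N(0, 1), and
    # sigma_t^2 = omega + alpha * e_{t-1}^2 + beta * sigma_{t-1}^2.
    rng = np.random.default_rng(seed)
    e = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance as a starting value
    e[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * e[t - 1] ** 2 + beta * sigma2[t - 1]
        e[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return e, np.sqrt(sigma2)

# Example: simulate 1000 heteroscedastic innovations and their volatilities.
errors, vols = simulate_garch11(1000)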
4. Paul Fortuin
Title: The Bayesian Lasso: Approximation and Standard Errors
Abstract: The Bayesian Lasso proposed by Park and Casella is a model in statistics and machine learning with a variety of applications. Park and Casella's model uses a Laplace prior to increase the sparsity of the coefficients in linear regression analysis. We examine Park and Casella's methods, including the use of Gibbs sampling, a Markov chain Monte Carlo method, to evaluate the model. Variational Bayesian approximations (variational Bayes) are introduced as a method of inference for the Bayesian Lasso. The performance of variational Bayesian approximations is discussed, and their tendency to under-approximate posterior standard errors is shown. We propose Bayesian Expectation Maximisation as an alternative to variational Bayes and explore its performance in calculating the standard errors of the Bayesian Lasso.

5. Mingxi Huang
Title: Sampling triangulations of 4-manifolds using the Metropolis-Hastings algorithm
Abstract: Manifolds form a fundamental class of spaces studied in topology, and the notion of a manifold generalises the concept of Euclidean space. An n-dimensional topological manifold is a space that locally looks like n-dimensional Euclidean space. Consequently, knowing that a space is a manifold does not tell us much about its global structure. To study the properties of a manifold, it is helpful to triangulate it, that is, to construct a homeomorphism to a simplicial complex. The global structure of a higher-dimensional manifold can hardly be observed directly, so triangulations are a good tool for studying the global structure and properties of n-dimensional manifolds. 4-manifolds are notoriously difficult to study in topology, and very little is known about general 4-manifolds; typical characteristics of 4-manifolds are studied across various fields. This theoretical difficulty is the reason we choose to test our method in the 4-dimensional setting. To investigate the structure of 4-manifolds, large numbers of triangulations of 4-manifolds are needed. Pachner moves on 4-dimensional triangulations are called 4-dimensional Pachner moves; there are five types of 4-dimensional Pachner moves, which will be explained in detail. Because the changes between triangulations in the Pachner graph have the Markov property, we can sample triangulations of 4-manifolds with the Metropolis-Hastings algorithm, a Markov chain Monte Carlo method for sampling from a target distribution that is known only up to a normalising constant. Many larger triangulations can be sampled, starting from an arbitrary triangulation, by constructing the proposal distribution and acceptance ratio of the Metropolis-Hastings method (a generic illustration of this accept/reject step is sketched after talk 6 below). In the thesis, we describe a Metropolis-Hastings method to sample large quantities of triangulations of a 4-manifold M using an initial triangulation T of M as a seed.

6. Shuyang Mo
Title: Semi-parametric estimation of Autoregressive Conditional Duration (ACD) models
Abstract: With the rapid development of computer technology in recent years, more and more high-frequency data have been collected, especially in finance; transaction durations of stocks are one example. As a result, models for exploring such data are needed. In this thesis, we discuss a popular model for such data, the Autoregressive Conditional Duration (ACD) model. We use an estimation method based on estimating functions to estimate the unknown parameters of the ACD model, and we compare its performance with the classical approach based on maximum likelihood through simulations and applications to real-world data.
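Talk 5 above centres on the Metropolis-Hastings accept/reject mechanism. As a generic illustration only, here is a minimal Python sketch of a toy random-walk sampler on the real line; it is not the thesis's triangulation sampler over Pachner moves, which would require the full Hastings correction for an asymmetric proposal.

import numpy as np

def metropolis_hastings(log_target, propose, x0, n_samples, seed=0):
    # Generic Metropolis algorithm with a symmetric proposal:
    # accept a proposed state x' with probability min(1, target(x') / target(x)).
    rng = np.random.default_rng(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_new = propose(x, rng)
        if np.log(rng.uniform()) < log_target(x_new) - log_target(x):
            x = x_new  # accept the move; otherwise stay at the current state
        samples.append(x)
    return np.array(samples)

# Toy usage: sample a standard normal target with Gaussian random-walk proposals.
draws = metropolis_hastings(lambda x: -0.5 * x ** 2,
                            lambda x, rng: x + rng.normal(scale=0.5),
                            x0=0.0, n_samples=5000)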
7. Zhoujiajing Wang
Title: Bayesian Estimation of Long Memory and Related Time Series Models in Finance
Abstract: Long-memory processes, also known as long-range-dependent processes, have become an essential part of time series analysis over the last several decades. Long-memory processes are characterised by slowly decaying autocorrelations and by a spectral density with a pole at the origin. These features dramatically change the statistical behaviour of the process and affect estimation and prediction. As a consequence, many theoretical results and methodologies that apply to the analysis of short-memory time series are no longer appropriate for long-memory processes. The purpose of this thesis is to provide a study of the theory and methods developed for estimation with long-memory time series data using both classical methods and Bayesian procedures, and to present these methodologies with specific applications to real-life financial time series. The thesis begins with a review of work on long-memory processes and discusses the theory of fractionally integrated autoregressive moving average (ARFIMA) processes, where we deal with a non-integer differencing parameter. We use the popular S&P 500 index and its returns for parameter estimation with maximum likelihood and Bayesian procedures.

8. Muzhi Yu
Title: Variational Bayes and Horseshoe prior for support vector machines
Abstract: The support vector machine (SVM) is a popular classifier in machine learning with many advantages: it can handle many linear problems, and even non-linear ones with a suitable kernel function. However, the traditional SVM assumes that the samples are independent and identically distributed, so Bayesian inference is introduced. Gibbs sampling, a special case of Markov chain Monte Carlo, simulates random draws from the posterior distribution to carry out Bayesian inference, but it is usually very time-consuming. Therefore variational Bayes, which minimises the gap between the marginal likelihood and a lower bound and uses that bound to approximate the posterior distribution, is used instead to save time. However, this approach does not perform feature selection when fitting the model, which is important for the analysis, so we introduce the Horseshoe prior. The resulting model takes less time than Gibbs sampling, thanks to variational Bayes, and also performs feature selection, thanks to the Horseshoe prior. We find that when the dimension of the features is equal to or larger than the number of observed samples, our model behaves better than other models and performs good feature selection. This is a good way to address the disadvantages of the linear support vector machine, and in future it could also be introduced to non-linear support vector machines, as many practical problems are non-linear.
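Talk 7 works with a non-integer differencing parameter d. As a small illustration of what that means in practice, here is a minimal Python sketch of the standard coefficients of the (1 - B)^d fractional differencing filter; the truncation length and the toy input series are illustrative assumptions, not the thesis's actual implementation.

import numpy as np

def frac_diff_weights(d, n_weights):
    # Coefficients pi_j of the fractional differencing filter (1 - B)^d,
    # via the standard recursion pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.
    pi = np.zeros(n_weights)
    pi[0] = 1.0
    for j in range(1, n_weights):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    return pi

def fractionally_difference(x, d, n_weights=100):
    # Apply a truncated (1 - B)^d filter to a series x, so that
    # (1 - B)^d x_t is approximated by sum_j pi_j * x_{t-j}.
    pi = frac_diff_weights(d, n_weights)
    x = np.asarray(x, dtype=float)
    out = np.empty(len(x))
    for t in range(len(x)):
        window = x[max(0, t - n_weights + 1): t + 1][::-1]  # x_t, x_{t-1}, ...
        out[t] = np.dot(pi[: len(window)], window)
    return out

# Toy usage: fractionally difference a simulated series with d = 0.3.
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(500)) * 0.01
filtered = fractionally_difference(series, d=0.3)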