``And once the storm is over you won't remember how you made it through, how you managed to survive. You won't even be sure, in fact, whether the storm is really over. But one thing is certain. When you come out of the storm you won't be the same person who walked in. That's what this storm's all about.''

**-Haruki Murakami**

----------------------------------------------

Over a quick four-day binge,
Patrick Kreitzberg, Kyle Lucke, and Oliver Serang wrote up
the fastest known method for performing selection on the
Cartesian sum $$Y=X_1+X_2+\cdots+X_m.$$ The technique is
important for Bayesian inference under additive
dependencies, computing the most abundant isotope peaks of
a compound, computing the most efficient *k*
configurations to build a product with *m*
independent supply lines, and more. Read the preprint
here.
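For intuition, here is the classic heap-based baseline for selecting the $$k$$ smallest values of a two-axis Cartesian sum; note this is *not* the paper's algorithm (which is faster and handles general $$m$$), just a sketch of the problem being solved:

```python
import heapq

def select_k_smallest_pairwise(x1, x2, k):
    """Return the k smallest values of {a + b : a in x1, b in x2}.

    Classic heap-based baseline for m = 2 axes only; x1 and x2 must be
    sorted ascending. Runs in O(k log k) time after sorting.
    """
    # Each heap entry is (value, i, j); the smallest pair is (0, 0).
    heap = [(x1[0] + x2[0], 0, 0)]
    seen = {(0, 0)}
    out = []
    while heap and len(out) < k:
        val, i, j = heapq.heappop(heap)
        out.append(val)
        # Only the two neighbors of a popped cell can hold the next-smallest sum.
        for ni, nj in ((i + 1, j), (i, j + 1)):
            if ni < len(x1) and nj < len(x2) and (ni, nj) not in seen:
                seen.add((ni, nj))
                heapq.heappush(heap, (x1[ni] + x2[nj], ni, nj))
    return out

# select_k_smallest_pairwise([1, 3, 5], [2, 4, 6], 4) → [3, 5, 5, 7]
```

Cascading pairwise selections like this over $$m$$ axes is one natural baseline; the appeal of the new method is beating such approaches.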

----------------------------------------------

Having won its bid for the 2019 Serang Lab Olympics, Missoula hosted competitors at Bonner Park for a triathlon of lab events. The first event was the longest frisbee throw with a successful catch: each participant selected a partner, who got a four-second head start running before the throw. Completed catches were marked with sparkling water (competitor's flavor of choice), and the best of three throws counted for scoring. The second event was the egg toss, in which competitors had to work together: each completed throw moved all remaining competitors one step back, until only one egg remained. The last event was the crab walk race. Going to the telephone pole would have been too easy (you know what else is easy? TV dinners! For once, ask the most of yourself!): competitors had to go around the telephone pole and return.

**FINAL STANDINGS:**

| COMPETITOR | FRISBEE RANK | EGG RANK | CRAB RANK | RANK SUM |
|---|---|---|---|---|
| MAX | 1 | 3 | 4 | 8 |
| JAKE | 3 | 2 | 8 | 13 |
| SEAN | 4 | 2 | 1 | 7 |
| SARAH | 5 | 1 | 5 | 11 |
| KYLE | 6 | 1 | 6 | 13 |
| BLAKE | 6 | 3 | 3 | 12 |
| PATRICK | 6 | 1 | 3 | 12 |

**WINNER:** Sean Rice wins the Strawberry-Watermelon Bubble Yum

----------------------------------------------

Oliver Serang has received an NSF CAREER award! This grant will fund the development of new combinatorics methods and a new season of the Exploring Scientific Wilderness podcast.

----------------------------------------------

Patrick Kreitzberg and Oliver Serang presented at ASMS
2019 in Atlanta, where Patrick got his first-ever taste
of Szechuan food.

To read Patrick's masters thesis (on computing *de
novo* neutral loss alphabets from mass spectra),
click here. To
get the code (MIT license),
visit https://bitbucket.org/orserang/peak-bagger.

To see Kyle Lucke *et al.*'s poster (presented by
Oliver) on the hitchhiking proteins during protein
inference, click here.

----------------------------------------------

Patrick Kreitzberg will defend his Masters Thesis on Wednesday, June 19th at 1PM in Social Sciences 362. Read his thesis here!

----------------------------------------------

The Department of Computer Science at the University of Montana will now offer Cybersecurity (at both the undergraduate and graduate levels)!

----------------------------------------------

Patrick Kreitzberg's work on meta *de novo* mass
spectrometry methods, "The Alphabet Projection of Mass
Spectrometry Data", was a finalist for the best of the
University of Montana's GradCon 2019!

----------------------------------------------

To celebrate the new year, lab members jumped into the snowy Clark Fork, then warmed up with some ramen, and then cooled back down with a quick game of Race for the Galaxy over ice cream.

Only 170 days until the days start getting shorter again!

----------------------------------------------

Lab members went camping in the mountains east of Missoula. It was damp, but everyone worked together to strip the wood, saw/baton logs, and build a good fire before the snow came.

This cozy season, which is viewed as the time when the veil between the living and the dead is at its thinnest, proved to be a perfect time to introduce everyone to "cheese dogs".

----------------------------------------------

During an afternoon of Patrick and Oliver playing fetch with Patrick's dog Butters, Butters was recorded in a video that is slated for use in official promotion for the University of Montana.

----------------------------------------------

The Serang Lab has been awarded a COBRE grant. This grant will be used to hire UM graduate students to build new graphical models for two cutting-edge problems from computational biology.

----------------------------------------------

The University of Montana has awarded the Serang Lab a pilot project grant. This grant will fund a UM computer science student to work on an exciting project developing EvergreenForest, a new graphical models library.

----------------------------------------------

Oliver Serang's textbook, *Code Optimization in C++11*, is now online.

Check it out here.


----------------------------------------------

Exploring Scientific Wilderness is a new podcast miniseries with short and sweet pieces of scientific brain candy.

*Step into the wilderness.*

----------------------------------------------

After years of work, the beta version of
EvergreenForest has been released! It includes a fast,
lazy implementation of trimmed convolution trees, a
fast in-house FFT written in C++11 (using the lab's
work on methods for performing bit-reversed
permutation), and lots of other Waldmeister-flavored
goodies to activate your almonds.

You can read more about it and download it
here.

----------------------------------------------

Trimmed convolution trees exploit sparsity in the input
priors and likelihood of a convolution tree to achieve
an algorithmic and practical speedup. For example, when
all input priors are $$X_i \in \{0,1\}$$ and the
likelihood $$Y=X_1 + X_2 + \cdots + X_n \in \{0,1\},$$
all posteriors can be solved with trimmed convolution
trees in $$O(n)$$ rather than the $$O(n \log(n)
\log(n))$$ required by non-trimmed convolution
trees.

Trimmed convolution trees generalize this to use
narrower priors and likelihoods, even when some priors
have larger support than others, or when the likelihood
distribution has larger or smaller support than the
priors, making it the first method
$$\in o(nk \log(nk) \log(n))$$ on some problems with
$$n$$ inputs, each with $$k$$ bins.
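A toy illustration of the trimming idea (this is only a sketch, not the EvergreenForest implementation): when downstream evidence restricts $$Y \in \{0,1\},$$ no intermediate convolution ever needs more than its first two bins.

```python
def conv(a, b, keep=None):
    """Dense convolution of two nonnegative vectors (index = value of the sum).

    If keep is given, only the first `keep` output bins are computed -- a
    stand-in for trimming: when evidence says Y ∈ {0, 1}, bins past index 1
    of any intermediate sum can never contribute to the answer.
    """
    n = len(a) + len(b) - 1 if keep is None else min(keep, len(a) + len(b) - 1)
    out = [0.0] * n
    for s in range(n):
        for i in range(max(0, s - len(b) + 1), min(s + 1, len(a))):
            out[s] += a[i] * b[s - i]
    return out

# Three indicator priors X_i ∈ {0, 1}; evidence forces Y = X_1 + X_2 + X_3 ∈ {0, 1}.
p = [0.5, 0.5]
full = conv(conv(p, p), p)                     # all 4 bins of Y
trimmed = conv(conv(p, p, keep=2), p, keep=2)  # only the 2 bins evidence allows
```

The trimmed result matches the first two bins of the full convolution while doing strictly less work, which is where the asymptotic savings come from.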

The paper (in review) includes a demo that uses
trimmed convolution trees to estimate posterior
probabilities for people's orders given their total
ice cream bill; the demo is, of course, written using
the menu from
Missoula's
own Big Dipper Ice Cream.

----------------------------------------------

The bit-reversed permutation is an important task for
implementing an in-place FFT library. This paper
explores several bit reversal methods on the x86
architecture, including new bit twiddling tricks and a
proposed template-recursive, cache-oblivious method,
which achieves state-of-the-art performance.
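For reference, the permutation itself is simple to state; here is the plain definition-level version (the paper's contribution is making this *fast*, e.g. with bit twiddling tricks and cache-oblivious recursion, none of which is shown here):

```python
def bit_reverse(i, bits):
    """Reverse the lowest `bits` bits of i (e.g. 0b001 -> 0b100 for bits=3)."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

def bit_reversed_permutation(xs):
    """Reorder xs (length must be a power of two) into bit-reversed index
    order -- the permutation an in-place radix-2 FFT applies to its input."""
    n = len(xs)
    bits = n.bit_length() - 1
    return [xs[bit_reverse(i, bits)] for i in range(n)]

# bit_reversed_permutation(list(range(8))) → [0, 4, 2, 6, 1, 5, 3, 7]
```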

The paper draft can be found
here. The
repository (linked in the paper) includes translations
in the mother tongue of each student who participated
(Chinese, German, Turkish, and Spanish) to help
students throughout the world participate in the joy
of scientific computing. So when someone says, ``Bit
reversal? How?'' you can say, ``很好!''

----------------------------------------------

Florian Heyl, who is doing his master's practicum in the
group, has just had his first paper accepted for
publication. The paper (Heyl & Serang 2017,
PROGRAMMING) introduces and benchmarks the
``template-recursive iteration over tensors'' (TRIOT)
design pattern in C++11. The method enables easy
vectorizing over tensors of different shapes and does not
require tensor dimensions to be fixed at compile
time.

Download the free header-only
library here.
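The flavor of the pattern can be sketched in Python, where the recursion over dimensions happens at runtime; this is only an analogue, since TRIOT's whole point is that the C++11 compiler performs the recursion at compile time via templates and can unroll the resulting loops:

```python
def tensor_map(fn, *tensors):
    """Elementwise-apply fn over nested-list tensors of matching shape whose
    depth (number of dimensions) is only known at runtime.

    A plain-Python sketch of the TRIOT idea: recurse one dimension at a
    time until scalars are reached, then apply fn.
    """
    if isinstance(tensors[0], list):
        # Descend one dimension: pair up corresponding slices of each tensor.
        return [tensor_map(fn, *slices) for slices in zip(*tensors)]
    return fn(*tensors)

# tensor_map(lambda a, b: a + b, [[1, 2], [3, 4]], [[10, 20], [30, 40]])
# → [[11, 22], [33, 44]]
```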

----------------------------------------------

The lab's
second paper on the
topic (Pfeuffer
& Serang 2016, Journal of Machine Learning Research)
builds upon the original idea, looking at the curve
(modeled using orthogonal polynomials) traced out by a
constant number of $$L_p$$ norms. The way these norms
bend toward the true maximum allows us to estimate the
maximum better than any single numerically stable
p-norm.

The paper highlights the approach with faster prediction
of U.S. unemployment given S&P 500 index data (via a
fast Viterbi path computation).
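The fact underneath all of this is that p-norms of a nonnegative vector approach its maximum from above as $$p$$ grows. A minimal sketch of that convergence (the papers' actual contribution, extrapolating across several norms with orthogonal polynomials, is not shown):

```python
def p_norm(xs, p):
    """The p-norm of a nonnegative vector; as p grows it approaches max(xs)
    from above, but large p risks numerical overflow/underflow in practice."""
    return sum(x ** p for x in xs) ** (1.0 / p)

xs = [0.2, 0.9, 0.4, 0.7]
# Each estimate overestimates max(xs) = 0.9, and the error shrinks with p.
estimates = [p_norm(xs, p) for p in (4, 16, 64)]
```

The numerical-stability tension is visible here: bigger $$p$$ means a better estimate but values like $$x^p$$ that rapidly leave floating-point range, which is exactly why extrapolating from a few small, stable norms is attractive.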

----------------------------------------------

Max-convolution, a cousin of standard convolution but on
the semiring $$( \times, \max )$$ (consider multiplying
two polynomials but taking the maximum coefficient for
every term rather than the sum of coefficients for that
term), arises frequently in combinatorics and in
Bayesian inference. In 2006, Bremner et al. discovered
the first solution $$\in o(n^2);$$ theirs is a beautiful
method, but it's only slightly sub-quadratic and relies
on a reduction to APSP, and it is not fast enough for
large-scale use in Bayesian networks. However, this
problem can be approximated numerically by using $$L_p$$
spaces, where FFT yields a method $$\in O(n
\log(n))$$ (Serang
2015, Journal of Computational Biology). The method
can also be used to improve hidden Markov models (HMMs)
with $$k$$ states from $$O(k^2 n)$$ to $$O(k \log(k)~
n),$$ whenever the transition matrix is a Toeplitz
matrix.
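A minimal numerical sketch of the $$L_p$$ relaxation: raise both vectors to the $$p$$-th power, standard-convolve, and take the $$p$$-th root. The inner sums are written naively below for clarity; the point of the relaxation is that they form one standard convolution, hence are FFT-able in $$O(n \log(n)).$$

```python
def max_convolve_exact(a, b):
    """Exact max-convolution: c[m] = max_k a[k] * b[m-k]. Naive O(n^2)."""
    n = len(a) + len(b) - 1
    return [max(a[k] * b[m - k]
                for k in range(max(0, m - len(b) + 1), min(m + 1, len(a))))
            for m in range(n)]

def max_convolve_p(a, b, p=32):
    """L_p approximation: sum of p-th powers stands in for max, because for
    nonnegative terms (sum_k t_k^p)^(1/p) -> max_k t_k as p grows."""
    ap = [x ** p for x in a]
    bp = [x ** p for x in b]
    n = len(a) + len(b) - 1
    out = []
    for m in range(n):
        s = sum(ap[k] * bp[m - k]
                for k in range(max(0, m - len(b) + 1), min(m + 1, len(a))))
        out.append(s ** (1.0 / p))
    return out

a = [0.5, 1.0, 0.25]
b = [0.8, 0.1, 0.6]
exact = max_convolve_exact(a, b)
approx = max_convolve_p(a, b)
```

For vectors of length $$n,$$ each approximate bin overshoots its exact value by at most a factor of $$n^{1/p},$$ which is what makes the trade between $$p$$ and accuracy controllable.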

Combined with the probabilistic convolution tree, the
method can be used for max-product inference when $$Y =
X_1 + X_2 + \cdots + X_n.$$ This can be used to solve a
generalization of the knapsack problem on
integer-discretized supports.

----------------------------------------------

The probabilistic convolution tree (Serang 2014, PLOS ONE) is a new dynamic programming algorithm that solves a class of probabilistic problems, which are equivalent to a generalization of subset-sum on integer-discretized supports. When $$Y = X_1 + X_2 + \cdots + X_n$$ and each $$X_i \in \{0, 1, \ldots k-1\}$$ it computes all marginals simultaneously with a runtime $$\in O(n k \log(n k) \log(n)).$$
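The forward pass of the tree can be sketched as follows. This is a simplified illustration only: the real probabilistic convolution tree also performs a backward pass so that *all* marginals come out of one run, and a production version would use FFT-based convolution to reach the stated runtime.

```python
def convolve(a, b):
    """Standard convolution of two distributions over {0, 1, 2, ...}."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def sum_distribution(priors):
    """Forward pass of a convolution tree: combine priors pairwise, level by
    level, so the distribution of Y = X_1 + ... + X_n is built in O(log n)
    rounds of convolutions (like a tournament bracket)."""
    layer = [list(p) for p in priors]
    while len(layer) > 1:
        nxt = [convolve(layer[i], layer[i + 1])
               for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:          # odd one out rides up to the next level
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

# Four fair coins: Y ~ Binomial(4, 0.5)
# sum_distribution([[0.5, 0.5]] * 4) → [0.0625, 0.25, 0.375, 0.25, 0.0625]
```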

----------------------------------------------
