Nonlinear nonmodal stability theory
RR Kerswell - Annual Review of Fluid Mechanics, 2018 - annualreviews.org
This review discusses a recently developed optimization technique for analyzing the
nonlinear stability of a flow state. It is based on a nonlinear extension of nonmodal analysis …
Reverse time migration: A prospect of seismic imaging methodology
Reverse time migration (RTM) is a seismic imaging method to map the subsurface reflectivity
using recorded seismic waveforms. The practice in exploration seismology has long …
FILIP: Fine-grained interactive language-image pre-training
Unsupervised large-scale vision-language pre-training has shown promising advances on
various downstream tasks. Existing methods often model the cross-modal interaction either …
Efficient large-scale language model training on GPU clusters using Megatron-LM
Large language models have led to state-of-the-art accuracies across several tasks.
However, training these models efficiently is challenging because: a) GPU memory capacity …
Learning transferable visual models from natural language supervision
State-of-the-art computer vision systems are trained to predict a fixed set of predetermined
object categories. This restricted form of supervision limits their generality and usability since …
Revisiting stereo depth estimation from a sequence-to-sequence perspective with transformers
Stereo depth estimation relies on optimal correspondence matching between pixels on
epipolar lines in the left and right images to infer depth. In this work, we revisit the problem …
Combined scaling for zero-shot transfer learning
Recent developments in multimodal training methodologies, including CLIP and ALIGN,
obviate the necessity for individual data labeling. These approaches utilize pairs of data and …
GPipe: Efficient training of giant neural networks using pipeline parallelism
Scaling up deep neural network capacity has been known as an effective approach to
improving model quality for several different machine learning tasks. In many cases …
Training deep nets with sublinear memory cost
We propose a systematic approach to reduce the memory consumption of deep neural
network training. Specifically, we design an algorithm that costs O(√n) memory to train …
Memory-efficient pipeline-parallel DNN training
Many state-of-the-art ML results have been obtained by scaling up the number of
parameters in existing models. However, parameters and activations for such large models …