FusionDTA: attention-based feature polymerizer and knowledge distillation for drug-target binding affinity prediction
W Yuan, G Chen, CYC Chen - Briefings in Bioinformatics, 2022 - academic.oup.com
… knowledge distillation for DTA tasks as an improvement in training strategy. Knowledge
distillation … knowledge from the teacher model with more parameters. Through transferring …
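The snippet above refers to transferring knowledge from a larger teacher model to a smaller student. Below is a minimal PyTorch-style sketch of that general teacher-student setup for a regression target such as binding affinity; the `distillation_loss` name, the `alpha` weighting, and the plain-MSE distillation term are illustrative assumptions, not FusionDTA's exact objective.

```python
# Minimal teacher-student distillation sketch for a regression task such as
# binding-affinity prediction. The alpha weighting and plain-MSE distillation
# term are illustrative assumptions, not the exact FusionDTA objective.
import torch.nn as nn

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Combine supervision from the ground-truth labels with soft
    supervision from a (frozen) teacher that has more parameters."""
    mse = nn.MSELoss()
    hard_loss = mse(student_pred, target)        # fit the labels
    soft_loss = mse(student_pred, teacher_pred)  # mimic the teacher
    return alpha * hard_loss + (1.0 - alpha) * soft_loss

# Usage: the teacher runs in eval mode with gradients detached,
# while the student is trained.
# teacher_pred = teacher(drug, protein).detach()
# loss = distillation_loss(student(drug, protein), teacher_pred, affinity)
```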
Improving drug-target affinity prediction via feature fusion and knowledge distillation
… introduced a novel knowledge-distillation insights drug-target affinity prediction model with
… and explainable predictions. We benchmarked the model on public affinity prediction and …
Channel-wise knowledge distillation for dense prediction
… on knowledge distillation focus on classification tasks [10, 11, 15, 26, 35, 37, 43]. Our work
here aims to study efficient and effective distillation methods for dense prediction, … affinity map. …
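For reference, channel-wise distillation for dense prediction is commonly formulated by softmax-normalizing each channel's spatial activation map into a distribution over pixels and matching teacher and student distributions with KL divergence. The sketch below follows that commonly cited formulation; the temperature `tau` and the loss scaling should be read as assumptions rather than the paper's fixed values.

```python
# Channel-wise distillation sketch: each channel's spatial activations are
# softmax-normalized into a distribution over pixels, and the student is
# trained to match the teacher's per-channel distributions via KL divergence.
import torch
import torch.nn.functional as F

def channel_wise_kd(student_feat, teacher_feat, tau=4.0):
    """student_feat, teacher_feat: (N, C, H, W) activations with matching
    shapes (project the student first if the channel counts differ)."""
    n, c, h, w = student_feat.shape
    s = student_feat.view(n, c, -1) / tau   # (N, C, H*W)
    t = teacher_feat.view(n, c, -1) / tau
    log_p_s = F.log_softmax(s, dim=-1)      # spatial softmax per channel
    p_t = F.softmax(t, dim=-1)
    # KL(teacher || student), averaged over channels and batch
    kl = (p_t * (torch.log(p_t + 1e-8) - log_p_s)).sum(-1)
    return (tau ** 2) * kl.mean()
```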
Cross-dimension affinity distillation for 3D EM neuron segmentation
… knowledge from a 3D teacher network to the 2D student network by minimizing the affinity
prediction … By incorporating online knowledge distillation, our method further enhances the …
A self-distillation embedded supervised affinity attention model for few-shot segmentation
Q Zhao, B Liu, S Lyu, H Chen - IEEE Transactions on Cognitive …, 2023 - ieeexplore.ieee.org
… affinity attention map to give a prior prediction of query target. Based on the two modules
mentioned above, we propose the Self-Distillation embedded Affinity … the knowledge distillation …
Channel Affinity Knowledge Distillation for Semantic Segmentation
… predicted maps in the spatial domain, but channel distillation also may help to improve
segmentation performance. Additionally, pairwise pixel affinity … a novel Channel Affinity KD (CAKD…
A knowledge distillation-guided equivariant graph neural network for improving protein interaction site prediction performance
S Chen, Z Tang, L You, CYC Chen - Knowledge-Based Systems, 2024 - Elsevier
… ) and compared their prediction performances without the self-distillation module. …
distillation module by analyzing the prediction performance of Layer_1 with KD (with the self-distillation …
Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism
… Affinity value corresponds to a smaller offset between the data distributions. In this paper, Affinity
… of data augmentation on the student’s prediction confidence and model calibration in KD …
Multi-task learning with knowledge distillation for dense prediction
… task-specific guidance to enable effective knowledge transfer. In Figure 1, we show the …
prediction when using KL divergence (see Figure 1). We propose a novel knowledge distillation …
Inter-region affinity distillation for road marking segmentation
… We study the problem of distilling knowledge from a large deep … In this work, we explore
a novel knowledge distillation (KD) ap… Apart from model predictions, we also show the deep …
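As a generic illustration of affinity-based distillation of the kind this entry explores, the sketch below builds pixel-to-pixel cosine-similarity (affinity) matrices from teacher and student feature maps and penalizes their difference. The actual inter-region method pools features over regions of interest first; that step is omitted here, so this should be read as an affinity-matching illustration, not the paper's exact pipeline.

```python
# Generic pairwise-affinity distillation sketch: match the cosine-similarity
# structure of student and teacher feature maps rather than raw activations.
import torch
import torch.nn.functional as F

def affinity_distillation(student_feat, teacher_feat):
    """student_feat, teacher_feat: (N, C, H, W); spatial sizes must match."""
    def affinity(feat):
        n, c, h, w = feat.shape
        f = feat.view(n, c, h * w)
        f = F.normalize(f, dim=1)                # unit-norm feature per position
        return torch.bmm(f.transpose(1, 2), f)   # (N, H*W, H*W) cosine affinities
    return F.mse_loss(affinity(student_feat), affinity(teacher_feat))
```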
Related searches
- binding affinity prediction
- knowledge distillation semantic segmentation
- knowledge distillation dense prediction
- knowledge distillation multi-task learning
- knowledge distillation drug target
- affinity prediction model
- transformer knowledge distillation
- self knowledge distillation
- multi-level knowledge distillation
- drug target affinity prediction
- interpretable drug affinity prediction
- hybrid graph affinity prediction
- graph mining affinity prediction
- representation learning affinity prediction
- self attention mechanism affinity prediction
- information fusion affinity prediction