Image 1
Caption
The REFERS workflow. Radiographs of the k-th patient study are forwarded through the radiograph transformer, the representations of the different views are fused with an attention mechanism, and report generation together with study–report representation consistency reinforcement is used to exploit the information in the radiology reports. Panel a: overview of the whole pipeline. Panel b: architecture of the radiograph transformer. Panel c: the attention mechanism for view fusion (MLP stands for multi-layer perceptron). Panel d: the two supervision tasks, report generation and study–report representation consistency reinforcement.
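The attention-based view fusion shown in panel c can be sketched in a few lines of PyTorch. This is a minimal illustration under assumed names and dimensions (ViewAttentionFusion, feat_dim=768, and the two-layer score MLP are all hypothetical), not the authors' implementation: an MLP scores each view's feature vector, a softmax over the views turns the scores into attention weights, and the weighted sum yields a single study-level representation.

import torch
import torch.nn as nn

class ViewAttentionFusion(nn.Module):
    """Fuses per-view radiograph features into one study-level
    representation via MLP-scored attention (illustrative sketch,
    not the published REFERS implementation)."""

    def __init__(self, feat_dim: int = 768, hidden_dim: int = 256):
        super().__init__()
        # MLP mapping each view's feature vector to a scalar score
        self.score_mlp = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, view_feats: torch.Tensor) -> torch.Tensor:
        # view_feats: (num_views, feat_dim), one row per radiograph view
        scores = self.score_mlp(view_feats)       # (num_views, 1)
        weights = torch.softmax(scores, dim=0)    # attention over views
        return (weights * view_feats).sum(dim=0)  # fused: (feat_dim,)

# Example: fuse frontal and lateral view features of one patient study
fusion = ViewAttentionFusion(feat_dim=768)
study_repr = fusion(torch.randn(2, 768))  # two views -> one study vector
print(study_repr.shape)  # torch.Size([768])

Taking the softmax over views (rather than over feature dimensions) makes the fused vector a convex combination of the per-view features, so studies with any number of views map into the same feature space.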
Credit
The University of Hong Kong
Usage Restrictions
This work is licensed for non-commercial, non-exclusive, one-time use with attribution to the copyright holder.
License
Original content