Refining Data Visualization Through Imitation: Network Visualization of Correlation Data in R
❝ As for future updates to this public account, I have not decided yet; I will take it one step at a time. ❞
Refining Data Visualization Through Imitation
In this series of posts, we start from the figures of top academic journals: we interpret each article's plotting ideas, imitate its plotting style, construct suitable plotting data, and apply the code to our own papers.
Why this plot: friends keep sharing very beautiful, polished figures, and I will most likely study and reproduce them. Everyone's time and energy are limited and precious, so here is why I still make the effort:
The figure looks great, and my own hands itch to try it.
I can use it in my own papers, so it is worth keeping in reserve.
It keeps me in a state of continuous learning.
Straight to the code:
Load R packages
rm(list = ls())  # start from a clean workspace

####----load R Package----####
library(tidyverse)   # dplyr, tidyr, purrr, ggplot2, ...
library(readxl)      # read_xlsx()
library(psych)       # corr.test()
library(ggraph)      # grammar-of-graphics network plots
library(tidygraph)   # tbl_graph(), centrality_degree()
library(patchwork)   # plot composition (loaded but not used below)

# The author's own helper scripts (not included in the post; sketches of what
# they are expected to do are given where they are used below).
source("R/corrlelation_analysis.R")
source("R/create_network.R")
Load the data
####----load Data----####
# Each sheet holds one "ASV" ID column plus one abundance column per sample;
# transposing puts samples in rows and ASVs in columns, the orientation that
# corr.test() expects.
ASV_B <- read_xlsx(path = "Input/ASV_B.xlsx", col_names = TRUE) %>%
  tibble::column_to_rownames(var = "ASV") %>%
  t() %>%
  as.data.frame()
ASV_F <- read_xlsx(path = "Input/ASV_F.xlsx", col_names = TRUE) %>%
  tibble::column_to_rownames(var = "ASV") %>%
  t() %>%
  as.data.frame()
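The two Excel inputs are not distributed with the post. To run the rest of the code without them, a small simulated stand-in with the same shape works; everything below (the dimensions, the rpois() abundances, the make_asv() helper name) is an assumption for illustration, not the author's data.
# Hypothetical stand-in for the Excel inputs: samples in rows, ASVs in
# columns, matching the transposed data frames above.
set.seed(123)
make_asv <- function(prefix, n_asv = 8, n_sample = 20) {
  mat <- matrix(rpois(n_asv * n_sample, lambda = 50), nrow = n_asv,
                dimnames = list(paste0(prefix, seq_len(n_asv)),
                                paste0("Sample", seq_len(n_sample))))
  as.data.frame(t(mat))
}
ASV_B <- make_asv("B_ASV")
ASV_F <- make_asv("F_ASV")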
####----correlation analysis----####
# corr.test() returns, among other things, the matrix of pairwise
# correlations ($r) and of p values ($p) between bacterial and fungal ASVs.
cor_out_odata <- corr.test(ASV_B, ASV_F)
cor_out <- correlation_analysis(cor_out_odata = cor_out_odata)
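correlation_analysis() lives in the sourced helper script and is not shown in the post. Judging from the columns used downstream (from, to, Correlation, and the Correlated sign label), it plausibly flattens the corr.test() result into a filtered edge list. The following is only a minimal sketch under that assumption; the function body, the cutoffs, and the "Positive"/"Negative" labels are guesses, not the author's code.
# Sketch (assumption): melt the r and p matrices to long format, keep strong
# and significant pairs, and label the sign of each correlation.
correlation_analysis <- function(cor_out_odata, r_cutoff = 0.6, p_cutoff = 0.05) {
  r_long <- cor_out_odata$r %>%
    as.data.frame() %>%
    tibble::rownames_to_column(var = "from") %>%
    tidyr::pivot_longer(-from, names_to = "to", values_to = "Correlation")
  p_long <- cor_out_odata$p %>%
    as.data.frame() %>%
    tibble::rownames_to_column(var = "from") %>%
    tidyr::pivot_longer(-from, names_to = "to", values_to = "p")
  r_long %>%
    dplyr::left_join(p_long, by = c("from", "to")) %>%
    dplyr::filter(abs(Correlation) >= r_cutoff, p <= p_cutoff) %>%
    dplyr::mutate(Correlated = ifelse(Correlation > 0, "Positive", "Negative"))
}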
####----Network----####
# create_network() is another sourced helper; the call is kept from the
# original post, but its result is overwritten by the explicit tbl_graph()
# construction below.
graph <- create_network(cor_out = cor_out)
# Node table: every unique ASV that appears as an edge endpoint, labelled by
# kingdom.
node_df <- data.frame(
  name = c(unique(cor_out$from), unique(cor_out$to)),
  type = c(rep("Bacterial", length(unique(cor_out$from))),
           rep("Fungal", length(unique(cor_out$to)))))
# Re-derive the node names in the order they appear along the edges; this
# order controls where each node sits on the circular layout.
name_df <- cor_out %>%
  group_by(from) %>%
  summarise(to = list(unique(to))) %>%
  ungroup() %>%
  mutate(combined = map2(from, to, ~ c(.x, .y))) %>%
  pull(combined) %>%
  unlist() %>%
  unique()
# Attach the kingdom label to each name in this plotting order.
type <- node_df$type[match(name_df, node_df$name)]
name_df <- data.frame(name = name_df,
                      type = type)
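As an aside, the match() lookup on the two lines above can be written as a join, which some readers find easier to audit. A sketch, applicable while name_df is still the bare character vector:
# Equivalent to the match()-based lookup: join the kingdom labels onto the
# ordered name vector.
name_df <- tibble::tibble(name = name_df) %>%
  dplyr::left_join(dplyr::distinct(node_df, name, type), by = "name")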
# tidygraph matches the character from/to columns of cor_out against the
# name column of name_df; out-degree is stored to size the nodes later.
graph <- tbl_graph(nodes = name_df,
                   edges = cor_out,
                   directed = TRUE) %>%
  tidygraph::mutate(Popularity = centrality_degree(mode = 'out'))
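Printing a tbl_graph shows its node and edge tibbles, which is a quick way to confirm that every edge endpoint matched a row in name_df:
graph  # prints the "Node Data" and "Edge Data" tibbles
graph %>%
  tidygraph::activate(nodes) %>%
  tibble::as_tibble() %>%
  dplyr::count(type)  # how many bacterial vs. fungal nodes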
Start plotting
####----Plot----####
p <- ggraph(graph, layout = "linear", circular = TRUE) +
  # Arcs between correlated ASVs: sign mapped to colour and linetype,
  # strength to transparency.
  geom_edge_arc(aes(color = Correlated,
                    linetype = Correlated,
                    alpha = abs(Correlation)),
                width = 1) +
  geom_node_point(aes(fill = type, size = Popularity), shape = 21) +
  # Push the labels slightly outside the circle and rotate each one so it
  # stays upright on both halves of the layout.
  geom_node_text(aes(x = 1.1 * x,
                     y = 1.1 * y,
                     label = name,
                     color = type,
                     angle = -((-node_angle(x, y) + 90) %% 180) + 90),
                 hjust = 'outward',
                 size = 6) +
  scale_edge_color_manual(values = c("#dd3497", "#41ab5d")) +
  scale_fill_manual(values = c("#FE9929", "#CCEBC5")) +
  scale_size_continuous(range = c(10, 20)) +
  coord_fixed(clip = "off") +
  theme_void() +
  theme(
    plot.margin = margin(3, 3, 3, 3, "cm"),
    # ggplot2 >= 3.5.0 deprecates a bare numeric legend.position;
    # legend.position.inside is the current spelling.
    legend.position = "inside",
    legend.position.inside = c(1.1, 0.5)
  )
p
ggsave(filename = "Output/p.pdf",
       plot = p,
       height = 12,
       width = 12.5)
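If a bitmap version is also needed (for slides or a quick preview), the same call can target a PNG; the 300 dpi below is an arbitrary choice, not from the original post.
# Optional PNG export at the same size (dpi value is an assumption).
ggsave(filename = "Output/p.png",
       plot = p,
       height = 12,
       width = 12.5,
       dpi = 300)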
Session information
####----sessionInfo----####
R version 4.3.0 (2023-04-21)
Platform: x86_64-apple-darwin20 (64-bit)
Running under: macOS 15.1.1
Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/4.3-x86_64/Resources/lib/libRlapack.dylib; LAPACK version 3.11.0
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
time zone: Asia/Shanghai
tzcode source: internal
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] patchwork_1.2.0.9000 tidygraph_1.2.3 ggraph_2.1.0 psych_2.3.9 readxl_1.4.3
[6] lubridate_1.9.3 forcats_1.0.0 stringr_1.5.1 dplyr_1.1.4 purrr_1.0.2
[11] readr_2.1.5 tidyr_1.3.1 tibble_3.2.1 ggplot2_3.5.1 tidyverse_2.0.0
loaded via a namespace (and not attached):
[1] viridis_0.6.4 utf8_1.2.4 generics_0.1.3 stringi_1.8.3 lattice_0.22-5 hms_1.1.3
[7] digest_0.6.37 magrittr_2.0.3 grid_4.3.0 timechange_0.2.0 cellranger_1.1.0 ggrepel_0.9.6
[13] gridExtra_2.3 fansi_1.0.6 viridisLite_0.4.2 scales_1.3.0 tweenr_2.0.3 textshaping_0.3.7
[19] mnormt_2.1.1 cli_3.6.3 crayon_1.5.2 graphlayouts_1.0.2 rlang_1.1.4 polyclip_1.10-7
[25] munsell_0.5.1 withr_3.0.1 tools_4.3.0 parallel_4.3.0 tzdb_0.4.0 colorspace_2.1-1
[31] vctrs_0.6.5 R6_2.5.1 lifecycle_1.0.4 MASS_7.3-60 ragg_1.2.6 pkgconfig_2.0.3
[37] pillar_1.9.0 gtable_0.3.5 glue_1.8.0 Rcpp_1.0.13 systemfonts_1.1.0 ggforce_0.4.2
[43] tidyselect_1.2.1 rstudioapi_0.15.0 farver_2.1.2 nlme_3.1-163 igraph_2.0.3 labeling_0.4.3
[49] compiler_4.3.0
Collection of past plots
Overview of public account posts
Phylogenetic tree collection
Circular plots
Scatter plots
Gene family collection (basic and advanced heatmap versions, plus an alternative arrangement)
Genome synteny
WGCNA, ggplot2 version
Other scientific plots
Collaboration, contact, and discussion
Many readers send private messages through the public account backend; I am sorry that I often miss them and end up not replying. Please add the author's WeChat below so we can discuss data analysis and visualization together.