Compare commits

..

340 Commits

Author SHA1 Message Date
528309ab84 Merge pull request 'kashin_maxim_lab_5' (#124) from kashin_maxim_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#124
2024-11-20 22:45:51 +04:00
0814d8533d Merge pull request 'kashin_maxim_lab_4' (#123) from kashin_maxim_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#123
2024-11-20 22:45:28 +04:00
354ee2679e Merge pull request 'yakovleva_yulia_lab_8 is ready' (#122) from yakovleva_yulia_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#122
2024-11-20 22:45:02 +04:00
d302bd2213 Merge pull request 'yakovleva_yulia_lab_7 is ready' (#121) from yakovleva_yulia_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#121
2024-11-20 22:44:39 +04:00
2aed7bf385 Merge pull request 'yakovleva_yulia_lab_6 is ready' (#120) from yakovleva_yulia_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#120
2024-11-20 22:44:06 +04:00
d4e24db25e Merge pull request 'kadyrov_aydar_lab_5' (#119) from kadyrov_aydar_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#119
2024-11-20 22:43:23 +04:00
c0ca1d4bb5 Merge pull request 'kadyrov_aydar_lab_4' (#117) from kadyrov_aydar_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#117
2024-11-20 22:43:05 +04:00
6eeb90ea45 Merge pull request 'tukaeva_alfiya_lab_8' (#116) from tukaeva_alfiya_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#116
2024-11-20 22:38:42 +04:00
bc2d7cb2f6 Merge pull request 'tukaeva_alfiya_lab_7' (#115) from tukaeva_alfiya_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#115
2024-11-20 22:37:46 +04:00
e1da6f26ab Merge pull request 'tukaeva_alfiya_lab_6' (#114) from tukaeva_alfiya_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#114
2024-11-20 22:37:01 +04:00
e5df53b5c2 Merge pull request 'turner_ilya_lab_2' (#113) from turner_ilya_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#113
2024-11-20 22:36:40 +04:00
c98770752e Merge pull request 'mochalov_danila_lab_3' (#112) from mochalov_danila_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#112
2024-11-20 22:36:16 +04:00
a800c3df86 Merge pull request 'Bazunov Andrew Lab 4' (#111) from bazunov_andrew_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#111
2024-11-20 22:35:35 +04:00
a51e33a201 Merge pull request 'turner_ilya_lab_1' (#110) from turner_ilya_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#110
2024-11-20 22:34:54 +04:00
a9af84010a Merge pull request 'Bazunov Andrew lab3' (#109) from bazunov_andrew_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#109
2024-11-20 22:34:26 +04:00
3645d0c1cd Merge pull request 'yakovleva_yulia_lab_5 is ready' (#107) from yakovleva_yulia_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#107
Reviewed-by: Alexey <a.zhelepov@mail.ru>
2024-11-20 22:33:27 +04:00
08f2f63ad4 Done 2024-10-27 19:42:27 +04:00
e4e3748a3d Completed 2024-10-27 19:09:16 +04:00
JulYakJul
5e522fbcc0 yakovleva_yulia_lab_8 is ready 2024-10-27 15:10:30 +04:00
JulYakJul
cae7189c1e fix 2024-10-27 14:06:02 +04:00
JulYakJul
2bfc8a0a43 yakovleva_yulia_lab_7 is ready 2024-10-27 14:02:15 +04:00
JulYakJul
1f89960672 fix 2024-10-27 13:06:24 +04:00
JulYakJul
ffb4c2a8a4 yakovleva_yulia_lab_6 is ready 2024-10-27 13:04:11 +04:00
NAP
1dc621e0be kadyrov_aydar_lab_5 2024-10-27 02:16:28 +04:00
NAP
11c62d9bf7 kadyrov_aydar_lab_5 2024-10-27 02:13:51 +04:00
NAP
03910a9a3f kadyrov_aydar_lab_4 2024-10-27 01:53:34 +04:00
f7d483196c tukaeva_alfiya_lab_8 is ready 2024-10-26 23:16:19 +04:00
545377f948 tukaeva_alfiya_lab_7 fix 2024-10-26 22:58:30 +04:00
bb867da520 tukaeva_alfiya_lab_7 is ready 2024-10-26 22:41:45 +04:00
c4a260ebda tukaeva_alfiya_lab_6 is ready 2024-10-26 22:26:14 +04:00
88392a8041 turner_ilya_lab_2 is ready 2024-10-26 21:09:43 +04:00
JulYakJul
400de30b49 fix 2024-10-26 20:04:39 +04:00
96a4e6ac43 mochalov_danila_lab_3 is ready 2024-10-26 18:18:28 +04:00
Bazunov Andrew Igorevich
03c52d0c76 Complete lab4 2024-10-26 17:51:52 +04:00
6dd4835f54 turner_ilya_lab_1 is ready 2024-10-26 17:34:47 +04:00
Bazunov Andrew Igorevich
5187005e6a complete lab 3 2024-10-26 14:46:33 +04:00
3b9698ac38 Merge pull request 'tsukanova_irina_lab_5' (#108) from tsukanova_irina_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#108
2024-10-26 13:01:34 +04:00
a456344432 Merge pull request 'rogashova_ekaterina_lab_3' (#106) from rogashova_ekaterina_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#106
2024-10-26 13:00:05 +04:00
383a5e3b25 Merge pull request 'kadyrov_aydar_lab_3' (#105) from kadyrov_aydar_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#105
2024-10-26 12:59:18 +04:00
2834efbbce Merge pull request 'kadyrov_aydar_lab_2' (#104) from kadyrov_aydar_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#104
2024-10-26 12:58:55 +04:00
decc46b37c Merge pull request 'tukaeva_alfiya_lab_5' (#103) from tukaeva_alfiya_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#103
2024-10-26 12:58:23 +04:00
a41e76795f Merge pull request 'artamonova_tatyana_lab_2 is ready' (#102) from artamonova_tatyana_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#102
2024-10-26 12:57:40 +04:00
bcfec37329 Merge pull request 'bogdanov_dmitry_lab_5' (#101) from bogdanov_dmitry_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#101
2024-10-26 12:56:47 +04:00
e17b0b0d61 Merge pull request 'bogdanov_dmitry_lab_4' (#100) from bogdanov_dmitry_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#100
2024-10-26 12:56:28 +04:00
62290fc43d Merge pull request 'zhimolostnova_anna_lab_6' (#95) from zhimolostnova_anna_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#95
2024-10-26 12:56:04 +04:00
0b5fb8da2e Merge pull request 'zhimolostnova lab 5 complete' (#94) from zhimolostnova_anna_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#94
2024-10-26 12:53:57 +04:00
9c6ef7e89e Merge pull request 'vaksman_valeria_lab_6' (#91) from vaksman_valeria_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#91
2024-10-26 12:52:19 +04:00
e763cf36e2 Merge pull request 'yakovleva_yulia_lab_4 is ready' (#90) from yakovleva_yulia_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#90
2024-10-26 12:51:56 +04:00
adf3f384a3 Merge pull request 'dozorova_alena_lab_8' (#99) from dozorova_alena_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#99
2024-10-26 12:50:30 +04:00
5ae6cd3cf1 Merge pull request 'dozorova_alena_lab_7' (#98) from dozorova_alena_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#98
2024-10-26 12:42:04 +04:00
daf3742ce6 Merge pull request 'zhimolostnova lab 8 complete' (#97) from zhimolostnova_anna_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#97
2024-10-26 12:35:58 +04:00
fb37a53f66 Merge pull request 'zhimolostnova lab 7 complete' (#96) from zhimolostnova_anna_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#96
2024-10-26 12:35:18 +04:00
23e035f9b2 Merge pull request 'vaksman_valeria_lab_8' (#93) from vaksman_valeria_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#93
2024-10-26 12:31:26 +04:00
556d8cf262 Merge pull request 'vaksman_valeria_lab_7' (#92) from vaksman_valeria_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#92
2024-10-26 12:30:20 +04:00
419790f5df Merge pull request 'borschevskaya_anna_lab_8' (#89) from borschevskaya_anna_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#89
2024-10-26 12:27:12 +04:00
54a9b8a778 Merge pull request 'kadyrov_aydar_lab_1' (#88) from kadyrov_aydar_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#88
2024-10-26 12:23:44 +04:00
3aeae245fa Merge pull request 'lazarev_andrey_lab_3' (#87) from lazarev_andrey_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#87
2024-10-26 12:23:21 +04:00
382273ccb8 Merge pull request 'borschevskaya_anna_lab_7 is ready' (#86) from borschevskaya_anna_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#86
2024-10-26 12:20:06 +04:00
4a37f55328 Merge pull request 'rogashova_ekaterina_lab_2' (#85) from rogashova_ekaterina_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#85
2024-10-26 12:14:14 +04:00
4e32398903 Merge pull request 'artamonova_tatyana_lab_1' (#84) from artamonova_tatyana_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#84
2024-10-26 12:13:43 +04:00
e69819aedd Merge pull request 'tukaeva_alfiya_lab_4 is ready' (#83) from tukaeva_alfiya_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#83
2024-10-26 12:12:59 +04:00
d9c4402ec9 Merge pull request 'kuzarin_maxim_lab_8' (#81) from kuzarin_maxim_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#81
2024-10-26 12:07:16 +04:00
93687ad850 Merge pull request 'kuzarin_maxim_lab_7' (#80) from kuzarin_maxim_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#80
2024-10-26 11:37:45 +04:00
4528bcd22c Merge pull request 'emelyanov_artem_lab_8' (#79) from emelyanov_artem_lab_8 into main
Reviewed-on: Alexey/DAS_2024_1#79
2024-10-26 11:35:53 +04:00
eef1d03249 Merge pull request 'emelyanov_artem_lab_7' (#78) from emelyanov_artem_lab_7 into main
Reviewed-on: Alexey/DAS_2024_1#78
2024-10-26 11:34:21 +04:00
7e09109cd2 Merge pull request 'emelyanov_artem_lab_6' (#77) from emelyanov_artem_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#77
2024-10-26 11:33:34 +04:00
f46724e5cf Merge pull request 'dozorova_alena_lab_6' (#76) from dozorova_alena_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#76
2024-10-26 11:27:22 +04:00
72b0b63e58 video 2024-10-25 21:03:05 +04:00
fd54e426b5 only the video left 2024-10-25 20:52:46 +04:00
JulYakJul
a5f0403627 yakovleva_yulia_lab_5 is ready 2024-10-25 18:12:36 +04:00
ad8894c0ca done, just need to finish the readme 2024-10-25 16:56:56 +04:00
edea94a4f2 Done 2024-10-25 14:10:33 +04:00
NAP
5700e75965 kadyrov_aydar_lab_3 2024-10-25 01:10:46 +04:00
NAP
9e9711f004 kadyrov_aydar_lab_2 2024-10-24 20:07:47 +04:00
014845df45 tukaeva_alfiya_lab_5 is ready 2024-10-24 15:46:08 +04:00
636592bbac artamonova_tatyana_lab_2 is ready 2024-10-23 21:29:01 +04:00
the
6711e8b0f6 Lab5 2024-10-23 18:36:31 +04:00
the
c91aa6e1f3 Fix 2024-10-23 15:58:26 +04:00
the
d340d34c0b README, fixes, images 2024-10-23 15:56:48 +04:00
the
aaff3b8183 Lab4 2024-10-23 14:11:04 +04:00
06a7114499 lab 8 complete 2024-10-22 20:52:00 +03:00
0246f32bcf lab 7 complete 2024-10-22 20:19:09 +03:00
417368d25e fix readme 2024-10-22 19:31:03 +03:00
20a39fa9a5 lab 6 complete 2024-10-22 19:30:00 +03:00
fb15f87160 lab 5 complete 2024-10-22 18:23:30 +03:00
f86dfba785 lab8 wow 2024-10-21 21:21:28 +04:00
e874c69b62 lab7 is ready 2024-10-21 21:17:41 +04:00
6f0726185a lab six is ready yep 2024-10-21 21:10:24 +04:00
JulYakJul
b4b0ef7730 fix readme 2024-10-21 14:35:40 +04:00
JulYakJul
4d51941016 fix readme 2024-10-21 14:34:02 +04:00
JulYakJul
a07b272c79 yakovleva_yulia_lab_4 is ready 2024-10-21 14:31:58 +04:00
7cb94c14b0 borschevskaya_anna_lab_8 is ready 2024-10-21 08:52:52 +04:00
NAP
506d544060 kadyrov_aydar_lab_1 2024-10-21 02:36:18 +04:00
1ef9e02d32 lazarev_andrey_lab_3 is ready 2024-10-20 23:32:58 +04:00
ff8a87ebb8 borschevskaya_anna_lab_7 is ready 2024-10-20 22:37:56 +04:00
740d49d368 Done 2024-10-20 21:59:10 +04:00
df1b8bd8ce artamonova_tatyana_lab_1 is ready 2024-10-20 19:29:17 +04:00
7549429b6b artamonova_tatyana_lab_1 is ready 2024-10-20 19:25:21 +04:00
00d9e2409a tukaeva_alfiya_lab_4 is ready 2024-10-20 19:04:32 +04:00
098cb9b9ad Update kuzarin_maxim_lab_8/README.md
minor stylistic touch-up
2024-10-19 19:44:12 +04:00
af39fdc505 Text written, need to check how the Md renders 2024-10-19 18:42:32 +03:00
ef603a8056 Update kuzarin_maxim_lab_7/README.md
Minor problem with two \n
2024-10-19 19:36:31 +04:00
c8b3124074 Added the essay as an MD file 2024-10-19 18:34:37 +03:00
ce853de348 feature: completed lab 8 2024-10-19 18:40:59 +04:00
c3ac60eaa2 fix: delete .idea 2024-10-19 18:03:24 +04:00
e12438b727 feature: completed lab 7 2024-10-19 18:00:48 +04:00
aa54f9187f wrote the essay 2024-10-19 14:47:04 +04:00
b1d8660774 + 2024-10-19 14:18:21 +04:00
6c66654acc feature: deleted lab 6 2024-10-19 14:17:40 +04:00
1d9c308bb4 generally fixing the markup 2024-10-19 14:17:25 +04:00
a64b6c7329 feature: completed lab 6 2024-10-19 14:17:22 +04:00
7ec5c45faa alignment attempt 2024-10-19 14:06:09 +04:00
340dc6aa19 adding the essay 2024-10-19 14:04:46 +04:00
a152275bb7 Merge branch 'main' into dozorova_alena_lab_6 2024-10-19 13:17:33 +04:00
6e3ec51fe7 Merge branch 'main' into dozorova_alena_lab_6 2024-10-19 13:17:09 +04:00
131dc39f6c Merge pull request 'borschevskaya_anna_lab_6 is ready' (#75) from borschevskaya_anna_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#75
2024-10-19 13:08:27 +04:00
d82f47e04c Merge pull request 'emelyanov_artem_lab_5' (#74) from emelyanov_artem_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#74
2024-10-19 12:56:19 +04:00
3175352d02 Merge pull request 'emelyanov_artem_lab_4' (#73) from emelyanov_artem_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#73
2024-10-19 12:49:26 +04:00
2e86e68e12 Merge pull request 'aleikin_artem_lab_1' (#72) from aleikin_artem_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#72
2024-10-19 12:46:13 +04:00
63dd60f20e Merge pull request 'bondarenko_max_lab_1' (#71) from bondarenko_max_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#71
2024-10-19 12:43:27 +04:00
63e031ef17 Merge pull request 'vaksman_valeria_lab_5' (#70) from vaksman_valeria_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#70
2024-10-19 12:33:16 +04:00
5fdabedcd6 Merge pull request 'kuzarin_maxim_lab_6' (#69) from kuzarin_maxim_lab_6 into main
Reviewed-on: Alexey/DAS_2024_1#69
2024-10-19 12:30:30 +04:00
9eadb70f85 fix link 2024-10-19 12:28:06 +04:00
5fd241a980 Merge pull request 'dozorova_alena_lab_5' (#67) from dozorova_alena_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#67
2024-10-19 12:26:46 +04:00
4f53dff75f Merge branch 'dozorova_alena_lab_5' of https://git.is.ulstu.ru/Alexey/DAS_2024_1 into dozorova_alena_lab_5 2024-10-19 12:24:25 +04:00
57b7675030 fix link 2024-10-19 12:24:07 +04:00
b1c16dc76c Merge pull request 'bogdanov_dmitry_lab_3' (#68) from bogdanov_dmitry_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#68
2024-10-19 12:23:02 +04:00
309911ed75 Merge pull request 'rogashova_ekaterina_lab_1 is ready' (#66) from rogashova_ekaterina_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#66
2024-10-19 12:08:44 +04:00
d23e808325 Merge pull request 'lazarev_andrey_lab_2' (#65) from lazarev_andrey_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#65
2024-10-19 12:05:03 +04:00
4c974bfb51 Merge pull request 'tsukanova_irina_lab_4' (#64) from tsukanova_irina_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#64
2024-10-19 12:00:40 +04:00
b573569a97 borschevskaya_anna_lab_6 is ready 2024-10-19 10:46:15 +04:00
60c79b64fb feature: deleted lab 5 2024-10-18 17:42:21 +04:00
07105e81a0 feature: completed lab 5 2024-10-18 17:41:49 +04:00
JulYakJul
0ebd562be2 Merge branch 'main' into yakovleva_yulia_lab_4 2024-10-18 17:02:27 +04:00
JulYakJul
22a3917d28 work 3 done 2024-10-18 16:59:19 +04:00
46b94ea885 feature: completed lab 4 2024-10-18 16:27:06 +04:00
JulYakJul
94b8ba783c work 2.2 done 2024-10-18 16:24:18 +04:00
JulYakJul
060bd2321e work 2 done 2024-10-18 16:11:22 +04:00
JulYakJul
a8f1b39dd7 work 1 done 2024-10-18 15:42:10 +04:00
d3a7046f97 aleikin_artem_lab1 is ready 2024-10-18 00:09:32 +04:00
06d65650ab aleikin_artem_lab1 is ready 2024-10-18 00:05:35 +04:00
992a169c9b Merge branch 'main' into bondarenko_max_lab_1 2024-10-17 23:22:03 +04:00
b82a13c106 bondarenko_max_lab_1 is ready 2024-10-17 23:20:01 +04:00
e33ffef85e fix 2024-10-17 22:43:46 +04:00
9362e62999 yesss lab 5 is ready 2024-10-17 20:01:43 +04:00
430fad9ef4 Merge pull request 'borschevskaya_anna_lab_5 is ready' (#63) from borschevskaya_anna_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#63
2024-10-16 16:50:14 +04:00
d0aedf8495 Merge pull request 'klyushenkova_ksenia_lab_1 is ready' (#62) from klyushenkova_ksenia_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#62
2024-10-16 16:49:29 +04:00
effd849042 Merge pull request 'emelaynov_artem_lab_3' (#61) from emelaynov_artem_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#61
2024-10-16 16:48:48 +04:00
55e18b6a64 Merge pull request 'vaksman_valeria_lab_3' (#60) from vaksman_valeria_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#60
2024-10-16 16:47:28 +04:00
5a7409d60c Merge pull request 'mochalov_danila_lab_2' (#59) from mochalov_danila_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#59
2024-10-16 16:46:55 +04:00
265cf478bf Merge pull request 'tukaeva_alfiya_lab_3 is ready' (#58) from tukaeva_alfiya_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#58
2024-10-16 16:45:44 +04:00
c6f29a13a1 Merge pull request 'vaksman_valeria_lab_4' (#57) from vaksman_valeria_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#57
2024-10-16 16:45:11 +04:00
4103a23984 Merge pull request 'Presnyakova Victoria Lab2' (#56) from presnyakova_victoria_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#56
2024-10-16 16:20:58 +04:00
f8ac151629 Merge pull request 'zhimolostnova_anna_lab 4 complete' (#55) from zhimolostnova_anna_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#55
2024-10-16 15:02:32 +04:00
13b5dfc707 Merge branch 'main' into dozorova_alena_lab_5 2024-10-16 14:31:12 +04:00
5d3517c2b0 Merge pull request 'dozorova_alena_lab_4' (#49) from dozorova_alena_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#49
2024-10-16 14:26:29 +04:00
f3bbfb2efd rogashova_ekaterina_lab_1 is ready 2024-10-14 23:20:57 +04:00
3c6c7f47e8 second lab done 2024-10-14 16:27:36 +04:00
dc7c2c9694 video 2024-10-14 16:09:46 +04:00
481631cda5 Merge pull request 'yakovleva_yulia_lab_3' (#54) from yakovleva_yulia_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#54
2024-10-14 15:48:22 +04:00
9b4f9b608c everything is done, only the video left 2024-10-14 15:37:29 +04:00
3b842c2228 Merge pull request 'kalyshev_yan_lab_2 is ready' (#53) from kalyshev_yan_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#53
2024-10-14 15:18:08 +04:00
c4b8f4b4de Merge pull request 'kuzarin_maxim_lab_5' (#52) from kuzarin_maxim_lab_5 into main
Reviewed-on: Alexey/DAS_2024_1#52
2024-10-14 12:29:54 +04:00
85567eea48 Merge pull request 'bogdanov_dmitry_lab_2' (#51) from bogdanov_dmitry_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#51
2024-10-14 12:19:26 +04:00
ea8da8c665 Merge pull request 'borschevskaya_anna_lab_4 is ready' (#50) from borschevskaya_anna_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#50
2024-10-14 11:03:34 +04:00
2497e3c742 borschevskaya_anna_lab_5 is ready 2024-10-13 11:03:08 +04:00
Pineapple
a628469960 klyushenkova_ksenia_lab_1 is ready 2024-10-12 23:40:16 +04:00
f107797a2d fix: deleted trash 2024-10-12 16:47:05 +04:00
98e9047b45 feature: completed lab 3 2024-10-12 16:45:56 +04:00
53f96303bc commit for commit 2024-10-11 19:32:26 +04:00
eb7211c6f9 Init. 2024-10-11 19:19:17 +04:00
66ffe827f8 mochalov_danila_lab_2 is ready 2024-10-11 05:18:46 +04:00
a0209b612e tukaeva_alfiya_lab_3 is ready 2024-10-11 01:01:25 +04:00
1f72d4dc70 oh, that readme 2024-10-10 21:03:09 +04:00
b351431f51 lab4 now is ready 2024-10-10 21:02:25 +04:00
56baf52b61 lab4 ready 2024-10-10 21:00:15 +04:00
f5ec3f1767 lab2 2024-10-10 18:52:01 +04:00
77790c37fb lab 4 complete 2024-10-09 17:12:11 +03:00
the
735a403027 Added README 2024-10-09 16:38:42 +04:00
the
c67049687b Done 2024-10-09 16:17:26 +04:00
022e2dc49e + 2024-10-08 23:46:12 +04:00
8f24aad349 + 2024-10-08 23:45:44 +04:00
a54e13f7ee + 2024-10-08 23:45:00 +04:00
1bb988ea2f dozorova_alena_lab_6 2024-10-08 23:43:24 +04:00
f7668394b0 dozorova_alena_lab_5 2024-10-08 22:46:32 +04:00
JulYakJul
a6a247cabf delete trash 2024-10-08 17:02:29 +04:00
JulYakJul
f5194bf885 Create README.md 2024-10-08 16:56:07 +04:00
JulYakJul
12cd98aa7d yakovleva_yulia_lab_3 is ready 2024-10-08 16:30:55 +04:00
85b809333b Merge pull request 'dolgov_dmitriy_lab_2' (#48) from dolgov_dmitriy_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#48
2024-10-07 23:44:23 +04:00
5e3c9c0d5b Merge pull request 'kashin_maxim_lab_3' (#47) from kashin_maxim_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#47
2024-10-07 23:39:34 +04:00
daf24d364d Merge pull request 'lab 3 complete' (#46) from zhimolostnova_anna_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#46
2024-10-07 23:36:16 +04:00
6c13deb231 Merge pull request 'vasina_ekaterina_lab_1 is ready' (#45) from vasina_ekaterina_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#45
2024-10-07 23:33:38 +04:00
543d41d9c3 Merge pull request 'tsukanova_irina_lab_3' (#44) from tsukanova_irina_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#44
2024-10-07 23:32:24 +04:00
153684c403 Merge pull request 'balakhonov_danila_lab_2' (#43) from balakhonov_danila_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#43
2024-10-07 23:29:07 +04:00
0708b01560 Merge pull request 'bazunov_andrew_lab_2' (#42) from bazunov_andrew_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#42
2024-10-07 23:28:10 +04:00
8a6932ff20 Merge pull request 'bogdanov_dmitry_lab_1' (#41) from bogdanov_dmitry_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#41
2024-10-07 23:24:48 +04:00
35cf16824d Merge pull request 'lazarev_andrey_lab_1 done' (#40) from lazarev_andrey_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#40
2024-10-07 23:09:33 +04:00
ac3dc2e566 Merge pull request 'vaksman_valeria_lab_2' (#39) from vaksman_valeria_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#39
2024-10-07 23:09:07 +04:00
2f46c05849 Merge pull request 'borschevskaya_anna_lab_3 is ready' (#38) from borschevskaya_anna_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#38
2024-10-07 23:07:51 +04:00
84cb26162c Merge pull request 'kalyshev_yan_lab_1' (#37) from kalyshev_yan_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#37
2024-10-07 23:07:09 +04:00
129b991712 Merge pull request 'tukaeva_alfiya_lab_2_fix' (#36) from tukaeva_alfiya_lab_2_fix into main
Reviewed-on: Alexey/DAS_2024_1#36
2024-10-07 23:06:38 +04:00
ffecef8fa3 Merge pull request 'dozorova_alena_lab_3' (#35) from dozorova_alena_lab_3_fix into main
Reviewed-on: Alexey/DAS_2024_1#35
2024-10-07 23:06:15 +04:00
1289d67a62 Merge pull request 'kuzarin_maxim_lab_4' (#34) from kuzarin_maxim_lab_4 into main
Reviewed-on: Alexey/DAS_2024_1#34
2024-10-07 23:01:48 +04:00
b09f3ea844 Merge pull request 'kashin_maxim_lab_2' (#31) from kashin_maxim_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#31
2024-10-07 23:00:05 +04:00
2f368ffb07 Update dolgov_dmitriy_lab_2/README.md 2024-10-07 15:31:54 +04:00
ead06782ad Update dolgov_dmitriy_lab_2/README.md 2024-10-07 14:12:39 +04:00
b2ac5eba9a Update dolgov_dmitriy_lab_2/README.md 2024-10-07 14:12:19 +04:00
0c0a47549a Update dolgov_dmitriy_lab_2/README.md 2024-10-07 14:11:37 +04:00
Аришина)
84e8cac198 Labrador 2024-10-07 14:09:04 +04:00
JulYakJul
3db4a0fcd4 Admin 2024-10-07 11:28:34 +04:00
a4f9cf13cc borschevskaya_anna_lab_4 is ready 2024-10-06 17:16:44 +04:00
bde242318f A strange comment left at 5 a.m.... 2024-10-06 00:35:20 +04:00
761cc83ebd Victory... 2024-10-06 00:33:30 +04:00
4699fda797 Phew, done. Only the readme left. 2024-10-05 23:40:53 +04:00
940cc6757f lab 3 complete 2024-10-05 21:14:57 +03:00
1e9bdf2806 Almost done. Fix the relations and write the readme.md 2024-10-05 22:02:08 +04:00
evasina2312@gmail.com
9bd14a60b4 vasina_ekaterina_lab_1 is ready 2024-10-05 22:00:12 +04:00
5aa2cae670 init 2024-10-05 01:24:02 +04:00
the
75b118ba6e Fixed README, done 2024-10-04 15:52:10 +04:00
the
d8441a0989 Clean, incorruptible working code for lab 2 2024-10-04 15:49:14 +04:00
the
1213b5db3c Clean, incorruptible working code for lab 2 2024-10-04 14:33:05 +04:00
281d30a89e lab 3 done 2024-10-03 16:17:37 +04:00
f2093f376c minor edits 2024-10-02 23:00:20 +04:00
80c666d6b0 fast fix: main readme slightly changed 2024-10-02 22:08:32 +04:00
a589994db5 fix: main readme changed
Added a link and a note about which folder the output data is in
2024-10-02 22:03:34 +04:00
38ce2bb347 fix: first service changed
it was saving to the wrong file...
2024-10-02 21:40:44 +04:00
f25af86d9c add: added dockerignore files 2024-10-02 21:40:15 +04:00
45eb2b72c5 fix: docker compose file
wrong tabs...
2024-10-02 21:07:18 +04:00
77bdc1d8e9 fix: changed the Dockerfile for the second service and edited the shared readme 2024-10-02 20:53:57 +04:00
da6593c4d0 Created the projects, Dockerfiles, and docker compose 2024-10-02 20:46:38 +04:00
Bazunov Andrew Igorevich
eeac04be49 Complete lab 2 2024-10-02 17:58:40 +04:00
the
0a73e2d5d4 bogdanov_dmitry_lab_1 is ready 2024-10-02 15:11:26 +04:00
1565e49462 lazarev_andrey_lab_1 done 2024-10-02 15:06:10 +04:00
858ea65e71 something works 2024-10-02 14:16:54 +04:00
0f898b968d tot 2024-10-02 10:47:53 +04:00
8b96102dbd more and more requests 2024-10-01 23:05:44 +04:00
82ecad71f4 is ready (hooray) 2024-10-01 19:42:31 +04:00
8a96320fd5 Merge pull request 'bazunov_andrew_lab_1' (#33) from bazunov_andrew_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#33
2024-09-30 22:18:50 +04:00
bd25930973 Merge pull request 'tsukanova_irina_lab_2' (#32) from tsukanova_irina_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#32
2024-09-30 22:18:28 +04:00
Zyzf
f0b48bba28 kalyshev_yan_lab_2 is ready 2024-09-29 20:05:33 +04:00
ef68b506b8 borschevskaya_anna_lab_3 is ready 2024-09-29 15:49:34 +04:00
Zyzf
8efc2422cf kalyshev_yan_lab_1 is ready 2024-09-29 12:39:22 +04:00
c3537b5abe a few changes 2024-09-27 23:26:36 +04:00
a0ef65e0f9 authors service 2024-09-27 16:53:32 +04:00
23087c87ea Update kuzarin_maxim_lab_6/README.md 2024-09-26 23:14:31 +04:00
5a6580ff8c Fix README 2024-09-26 22:14:06 +03:00
5f6472b5ff Lab implemented. Everything still needs checking against the description 2024-09-26 22:12:33 +03:00
e1950c80ea fixed the readme 2024-09-26 22:33:36 +04:00
5586bec4b8 finished work 2024-09-26 22:31:01 +04:00
6815b2e560 tukaeva_alfiya_lab_2 is fix 2024-09-26 15:53:43 +04:00
48b7fbd900 tukaeva_alfiya_lab_2 is ready 2024-09-26 15:44:30 +04:00
080625d270 fixing the request 2024-09-26 11:04:52 +04:00
37996c249a Merge pull request 'dolgov_dmitriy_lab_1' (#29) from dolgov_dmitriy_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#29
2024-09-26 10:25:37 +04:00
9456d4fe01 Merge pull request 'borschevskaya_anna_lab_2 is ready' (#25) from borschevskaya_anna_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#25
2024-09-26 10:20:55 +04:00
c14e105db5 Merge pull request 'presnyakova_victoria_lab_1' (#24) from presnyakova_victoria_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#24
2024-09-26 09:59:12 +04:00
4d1e900721 Merge pull request 'yakovleva_yulia_lab_2' (#20) from yakovleva_yulia_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#20
Reviewed-by: Alexey <a.zhelepov@mail.ru>
2024-09-26 08:45:08 +04:00
7184d6d728 Обновить bazunov_andrew_lab_1/README.md 2024-09-25 15:44:25 +04:00
Bazunov Andrew Igorevich
6e7055efa4 update readme 2024-09-25 15:37:57 +04:00
Bazunov Andrew Igorevich
9e40adc53c edit docker compose 2024-09-25 15:19:28 +04:00
Bazunov Andrew Igorevich
4a36528cc7 Complete docker compose 2024-09-25 12:35:39 +04:00
ad3988e5fc video added 2024-09-25 10:58:41 +04:00
780b4b2924 add readme and fix 2024-09-25 10:51:07 +04:00
d9f5f75f5e Adding comments and readme 2024-09-24 21:07:33 +04:00
7d9c9ec4d0 Only the readme left to do 2024-09-24 18:13:22 +04:00
5047b16cde files 2024-09-24 16:56:39 +04:00
2b87427299 something is there 2024-09-24 16:55:37 +04:00
JulYakJul
21cdd4971d fix link 2024-09-24 14:53:05 +04:00
6b55b7b0fc Merge pull request 'minhasapov_ruslan_lab_1' (#23) from minhasapov_ruslan_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#23
2024-09-24 13:43:10 +04:00
47193155d9 Merge pull request 'kashin_maxim_lab_1' (#22) from kashin_maxim_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#22
2024-09-24 13:21:02 +04:00
bc8c4c887e Merge pull request 'zhimolostnova_anna_lab_2' (#21) from zhimolostnova_anna_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#21
2024-09-24 13:17:26 +04:00
4a2adcc35a Merge pull request 'yakovleva_yulia_lab_1' (#19) from yakovleva_yulia_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#19
2024-09-24 11:59:06 +04:00
d7cb666a0d Merge pull request 'kuzarin_maxim_lab_3' (#17) from kuzarin_maxim_lab_3 into main
Reviewed-on: Alexey/DAS_2024_1#17
2024-09-24 11:58:22 +04:00
6c642384c1 Merge pull request 'zhimolostnova_anna_lab_1' (#16) from zhimolostnova_anna_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#16
2024-09-24 11:52:56 +04:00
bdb5cc07ed Update dolgov_dmitriy_lab_1/README.md 2024-09-24 01:30:02 +04:00
e761e33201 Update dolgov_dmitriy_lab_1/README.md 2024-09-24 01:28:51 +04:00
Аришина)
ceee500b95 Lab 1 is done 2024-09-24 01:20:27 +04:00
4c74a16753 tutorial 3 2024-09-23 22:45:22 +04:00
a830cb2198 tutorial 2 2024-09-23 22:42:39 +04:00
9d0fa199f7 first work 2024-09-23 21:35:24 +04:00
2be2c71b69 transfer 2024-09-23 20:19:10 +04:00
JulYakJul
aa8180ba49 Merge branch 'main' into yakovleva_yulia_lab_2 2024-09-23 17:34:56 +04:00
c509e74465 Merge pull request 'balakhonov_danila_lab_1' (#15) from balakhonov_danila_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#15
2024-09-23 16:55:14 +04:00
314751f25c Merge pull request 'tukaeva_alfiya_lab_1 is ready' (#14) from tukaeva_alfiya_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#14
2024-09-23 16:54:53 +04:00
48f7f3a215 Merge pull request 'polevoy_sergey_lab_1' (#13) from polevoy_sergey_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#13
2024-09-23 16:54:09 +04:00
f112d2a44b Merge pull request 'mochalov_danila_lab_1' (#12) from mochalov_danila_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#12
2024-09-23 16:53:36 +04:00
477afb824d Merge pull request 'dozorova_alena_lab_2' (#11) from dozorova_alena_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#11
2024-09-23 16:53:14 +04:00
6ce78e60ad Fixed file generation. 2024-09-23 16:00:41 +04:00
b13182c80e Docker problem on this computer. Saving progress. 2024-09-23 15:28:34 +04:00
e7b9938278 Merge pull request 'emelyanov_artem_lab_2' (#10) from emelyanov_artem_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#10
2024-09-23 13:45:07 +04:00
822467bd99 add branch + readme 2024-09-23 13:20:26 +04:00
JulYakJul
ba7480cb4f fix 2024-09-23 13:10:28 +04:00
520337f92d borschevskaya_anna_lab_2 is ready 2024-09-23 08:40:17 +04:00
6de5160da9 Lab 5 is done. 2024-09-22 22:14:00 +03:00
d7faf2a1b7 Update kuzarin_maxim_lab_4/README.md
Now the images will actually display
2024-09-22 20:23:01 +04:00
d98803227e Update kuzarin_maxim_lab_4/README.md 2024-09-22 20:22:20 +04:00
6f12270c73 Update kuzarin_maxim_lab_4/README.md 2024-09-22 20:21:18 +04:00
6e6266c228 Lab 4 is done, but the README may still need fixing... 2024-09-22 19:19:44 +03:00
06d1d8cdd4 lab1 2024-09-22 18:06:51 +04:00
4c76a9dea6 minhasapov_ruslan_lab_1 is ready 2024-09-21 22:14:08 +04:00
e5d0aa0b3d Completed 2024-09-21 16:19:03 +04:00
d326e64f24 fix readme again 2024-09-21 16:15:48 +04:00
1a118ae71f fix readme 2024-09-21 16:13:24 +04:00
e9b06b1f27 complete lab 2 2024-09-21 16:11:07 +04:00
JulYakJul
1adaac9281 yakovleva_yulia_lab_2 is ready 2024-09-20 18:36:39 +04:00
f0083bc4cd feature: add .yml and .env files, start readme 2024-09-20 01:42:54 +04:00
JulYakJul
5e9e2600f3 yakovleva_yulia_lab_1 is ready 2024-09-19 16:14:05 +04:00
b6e311755e add branch + readme 2024-09-19 15:54:13 +04:00
JulYakJul
0c3e973307 Revert "Merge pull request 'yakovleva_julia_lab_1' (#9) from yakovleva_julia_lab_1 into main"
This reverts commit c474c13c4a, reversing
changes made to 829a04a913.
2024-09-19 15:50:52 +04:00
c474c13c4a Merge pull request 'yakovleva_julia_lab_1' (#9) from yakovleva_julia_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#9
2024-09-19 15:42:48 +04:00
8eedde24a1 Lab 3 is done. A couple of points need checking, but overall everything should be fine 2024-09-19 10:53:49 +03:00
829a04a913 Merge pull request 'Kuzarin_maxim_lab_2' (#8) from kuzarin_maxim_lab_2 into main
Reviewed-on: Alexey/DAS_2024_1#8
2024-09-19 11:19:58 +04:00
57970b3333 fix readme 2024-09-19 02:08:16 +04:00
1c77ba3272 fix readme 2024-09-19 02:05:34 +04:00
ce9527b1c9 fix comments 2024-09-19 02:02:41 +04:00
a1419f21ec changes readme 2024-09-19 02:00:03 +04:00
aac01e9f48 complete lab 1 2024-09-19 01:56:40 +04:00
221f3e248b Lab work number 1 completed 2024-09-18 23:53:53 +04:00
3d98388a13 tukaeva_alfiya_lab_1 is ready 2024-09-18 23:09:14 +04:00
4922e9075e polevoy_sergey_lab_1_completed 2024-09-18 19:01:17 +04:00
891eae4211 mochalov_danila_lab_1 is ready 2024-09-18 17:02:04 +04:00
121e4bbcd2 dozorova_alena_lab_2 2024-09-17 22:46:46 +04:00
0590f7b532 feature: add README.md 2024-09-17 22:26:19 +04:00
0eec58a347 feature: completed lab 2 2024-09-17 22:07:57 +04:00
JulYakJul
c8dbd5fb37 yakovleva_julia_lab_1 is ready 2024-09-17 17:43:15 +04:00
253ad80e31 The work is done. Need to check the Readme, but it seems fine 2024-09-17 14:13:30 +03:00
f980a74f5e Merge pull request 'emelyanov_artem_lab_1' (#7) from emelyanov_artem_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#7
2024-09-17 14:55:50 +04:00
e10ae36577 Merge pull request 'borschevskaya_anna_lab_1' (#6) from borschevskaya_anna_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#6
2024-09-17 14:54:05 +04:00
46b8ecfc54 Merge pull request 'vaksman_valerya_lab_1' (#5) from vaksman_valerya_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#5
2024-09-17 14:52:42 +04:00
262193a301 Merge pull request 'tsukanova_irina_lab_1' (#3) from tsukanova_irina_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#3
2024-09-17 14:36:47 +04:00
48711e14e3 Merge pull request 'kuzarin_maxim_lab_1' (#2) from kuzarin_maxim_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#2
2024-09-17 14:11:50 +04:00
39664ac3a1 Merge pull request 'dozorova_alena_lab_1 is ready' (#1) from dozorova_alena_lab_1 into main
Reviewed-on: Alexey/DAS_2024_1#1
2024-09-17 14:10:20 +04:00
7af877c37a feature: completed lab 1 2024-09-17 13:08:06 +04:00
7d2ae7430d is super duper ready 2024-09-15 21:59:03 +04:00
ec21e89033 borschevskaya_anna_lab_1 is ready 2024-09-15 21:51:58 +04:00
afddfcf91f is super ready 2024-09-15 21:36:13 +04:00
9b0cb3582d ready 2024-09-15 21:34:49 +04:00
37080832d5 is ready 2024-09-15 21:23:41 +04:00
39fdc511ee Init commit. 2024-09-15 19:23:41 +04:00
2714d4e718 tsukanova_irina_lab_1 is ready 2024-09-15 16:18:03 +04:00
4af4abcb7f dozorova_alena_lab_1 is ready 2024-09-13 23:02:10 +04:00
2842 changed files with 342793 additions and 0 deletions

6
.idea/.gitignore vendored Normal file
View File

@ -0,0 +1,6 @@
# Default ignored files
/shelf/
/workspace.xml
/DAS_2024_1.iml
/modules.xml
/vcs.xml

1
.idea/.name Normal file
View File

@ -0,0 +1 @@
main.py

View File

@ -0,0 +1,12 @@
<component name="InspectionProjectProfileManager">
<profile version="1.0">
<option name="myName" value="Project Default" />
<inspection_tool class="PyUnresolvedReferencesInspection" enabled="true" level="WARNING" enabled_by_default="true">
<option name="ignoredIdentifiers">
<list>
<option value="str.__pos__" />
</list>
</option>
</inspection_tool>
</profile>
</component>

View File

@ -0,0 +1,6 @@
<component name="InspectionProjectProfileManager">
<settings>
<option name="USE_PROJECT_PROFILE" value="false" />
<version value="1.0" />
</settings>
</component>

4
.idea/misc.xml Normal file
View File

@ -0,0 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectRootManager" version="2" project-jdk-name="Python 3.9 (tukaeva_alfiya_lab_4)" project-jdk-type="Python SDK" />
</project>

View File

@ -0,0 +1,32 @@
# Lab 1 - Introduction to Docker and Docker Compose
## PIbd-42 || Aleykin Artem
### Description
In this lab we deploy two popular services, MediaWiki and Redmine, using Docker Compose. Each service runs in its own container and uses a shared PostgreSQL database for data storage. We also configure port forwarding to reach the services' web interfaces and use Docker volumes to persist data outside the containers.
### Project goal
Studying modern containerization technologies
### Steps to run:
1. Clone the repository:
```
git clone <repository-url>
cd <repository-folder>
```
2. Start the containers:
```
docker-compose up -d
```
3. After startup, the following containers should be available:
MediaWiki: http://localhost:8080
Redmine: http://localhost:8081
4. To stop the containers:
```
docker-compose down
```
Demo video: https://vk.com/video248424990_456239601?list=ln-sCRa9IIiV1VpInn2d1

View File

@ -0,0 +1,45 @@
services:
  mediawiki:
    image: mediawiki
    container_name: mediawiki
    ports:
      - "8080:80" # Forward port 8080 on the host for access to MediaWiki
    volumes:
      - mediawiki_data:/var/www/html/images # Volume for MediaWiki data
    environment:
      - MEDIAWIKI_DB_HOST=db
      - MEDIAWIKI_DB_NAME=mediawiki
      - MEDIAWIKI_DB_USER=root
      - MEDIAWIKI_DB_PASSWORD=example
    depends_on:
      - db
  redmine:
    image: redmine
    container_name: redmine
    ports:
      - "8081:3000" # Forward port 8081 on the host for access to Redmine
    volumes:
      - redmine_data:/usr/src/redmine/files # Volume for Redmine data
    environment:
      - REDMINE_DB_POSTGRES=db # The official redmine image expects REDMINE_DB_POSTGRES for the host
      - REDMINE_DB_DATABASE=redmine
      - REDMINE_DB_USERNAME=root
      - REDMINE_DB_PASSWORD=example
    depends_on:
      - db
  db:
    image: postgres:latest
    container_name: db
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: example
      POSTGRES_DB: postgres
    volumes:
      - db_data:/var/lib/postgresql/data # Volume for the database (the postgres image stores data in /var/lib/postgresql/data)
volumes:
  mediawiki_data: # volume for MediaWiki
  redmine_data: # volume for Redmine
  db_data: # volume for the database

View File

@ -0,0 +1,48 @@
## Docker Compose configuration report
### Summary:
This Docker Compose configuration starts the set of services needed to run WordPress and MediaWiki. It includes:
- **WordPress:** a web service for blogs and CMS
- **MySQL:** the database that stores WordPress data
- **RabbitMQ:** a message broker for potential future use
- **MediaWiki:** a wiki engine for creating and editing wiki pages
### Running the lab:
1. Install Docker and Docker Compose.
2. Save the configuration as docker-compose.yml.
3. Run docker-compose up --build
### Technologies used:
- **Docker Compose:** a tool for defining and running multi-container applications.
- **Docker:** a platform for building, deploying, and running containers.
- **WordPress:** a popular platform for blogs and CMS.
- **MySQL:** a popular database management system.
- **RabbitMQ:** a message broker used for asynchronous messaging.
- **MediaWiki:** free software for creating and editing wiki pages.
### Functionality:
The configuration starts the following services:
- **WordPress:** runs on port 8080, available at http://localhost:8080.
- **MySQL:** provides the database for WordPress and MediaWiki.
- **RabbitMQ:** runs on port 5672; the management UI is available at http://localhost:15672.
- **MediaWiki:** runs on port 8081, available at http://localhost:8081.
### Additional notes
- **Volumes**: used to store service data so it is not lost when containers restart.
- **Depends_on**: declares dependencies between services; for example, WordPress depends on MySQL.
- **Restart policy**: defines how services are restarted after a failure.
### Video
https://vk.com/video/@artamonovat?z=video212084908_456239356%2Fpl_212084908_-2
### Conclusion:
This Docker Compose configuration provides a simple and convenient way to start and manage several services related to WordPress and MediaWiki. It lets developers easily deploy and manage applications in an isolated environment.

View File

@ -0,0 +1,61 @@
version: '3.7'
services:
  wordpress:
    image: wordpress:latest
    ports:
      - "8080:80"
    volumes:
      - wordpress_data:/var/www/html
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_NAME: wordpress
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: password
    depends_on:
      - db
    restart: unless-stopped
  db:
    image: mysql:latest
    volumes:
      - db_data:/var/lib/mysql
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: dbpassword
      MYSQL_ROOT_PASSWORD: rootpassword
    restart: unless-stopped
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "5672:5672"
      - "15672:15672"
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq
    environment:
      RABBITMQ_DEFAULT_USER: guest
      RABBITMQ_DEFAULT_PASS: password
    restart: unless-stopped
  mediawiki:
    image: mediawiki:latest
    ports:
      - "8081:80"
    volumes:
      - mediawiki_data:/var/www/html
    environment:
      MW_DB_SERVER: db
      MW_DB_NAME: mediawiki
      MW_DB_USER: mediawiki
      MW_DB_PASSWORD: mediawiki_password
    depends_on:
      - db
    restart: unless-stopped
volumes:
  wordpress_data:
  db_data:
  rabbitmq_data:
  mediawiki_data:

5
artamonova_tatyana_lab_2/.gitignore vendored Normal file
View File

@ -0,0 +1,5 @@
*.pyc
__pycache__
*.egg-info
*.dist-info
.DS_Store

View File

@ -0,0 +1,22 @@
## Lab 2
### Completed by Artamonova Tatyana, PIbd-42
**Option 1: Program 4 - Number of characters in file names from the /var/data directory**
- Produces the file /var/result/data1.txt so that each line of the file is the number of characters in a file name from the /var/data directory.
**Option 2: Program 3 - Count of numbers in the sequence**
- Finds the largest number in the file /var/result/data1.txt and saves the count of such numbers in the sequence to /var/result/data2.txt.
**Project structure:**
1. The worker-1 and worker-2 folders contain the executable .py files and the Dockerfiles with the required sets of instructions.
2. The data folder contains the files whose name lengths need to be counted.
3. The result folder contains the program output files: data1.txt is the output of main1.py (worker-1), data2.txt is the output of main2.py (worker-2). The data in data2 is computed from the data in data1.
4. The .gitignore file specifies which files are tracked and which are not.
5. docker-compose.yml defines and manages the Docker containers.
**Run command** - docker-compose up --build
**Video link:** https://vk.com/artamonovat?z=video212084908_456239357%2Fvideos212084908%2Fpl_212084908_-2

View File

@ -0,0 +1,22 @@
services:
  worker-1:
    build:
      context: ./worker-1
    volumes:
      - ./worker-1:/app
      - ./data:/var/data
      - ./result:/var/result
  worker-2:
    build:
      context: ./worker-2
    volumes:
      - ./worker-2:/app
      - ./data:/var/data
      - ./result:/var/result
    # worker-2 reads data1.txt produced by worker-1, so it should start after worker-1
    depends_on:
      - worker-1
volumes:
  data:
  result:

View File

@ -0,0 +1,3 @@
15
18
18

View File

@ -0,0 +1 @@
2

View File

@ -0,0 +1,14 @@
# Use the Python 3.10-slim image as the base for our container.
# The slim variant of the image is more compact, which keeps the container small.
FROM python:3.10-slim

# Set the working directory inside the container to /app.
# All subsequent commands run in this directory.
WORKDIR /app

# Copy main1.py from the current directory into /app in the container.
COPY main1.py .

# Define the command that runs when the container starts.
# Here it runs the Python script main1.py.
CMD ["python", "main1.py"]

View File

@ -0,0 +1,21 @@
import os
import glob

# Produces data1.txt so that each line of the file is the number of
# characters in a file name from the /var/data directory
def main():
    data_dir = "/var/data"
    result_file = "/var/result/data1.txt"
    result_dir = os.path.dirname(result_file)
    os.makedirs(result_dir, exist_ok=True)
    files = glob.glob(os.path.join(data_dir, '*'))
    with open(result_file, 'w') as f:
        for file in files:
            filename = os.path.basename(file)
            f.write(f"{len(filename)}\n")

if __name__ == "__main__":
    main()

View File

@ -0,0 +1,14 @@
# Use the Python 3.10-slim image as the base for our container.
# The slim variant of the image is more compact, which keeps the container small.
FROM python:3.10-slim

# Set the working directory inside the container to /app.
# All subsequent commands run in this directory.
WORKDIR /app

# Copy main2.py from the current directory into /app in the container.
COPY main2.py .

# Define the command that runs when the container starts.
# Here it runs the Python script main2.py.
CMD ["python", "main2.py"]

View File

@ -0,0 +1,26 @@
import os

# Finds the largest number in data1.txt and saves the count of its
# occurrences in the sequence to data2.txt
def main():
    data_file_path = "/var/result/data1.txt"
    result_file_path = "/var/result/data2.txt"
    if not os.path.exists(data_file_path):
        print(f"Input file {data_file_path} does not exist")
        return
    # Make sure the output directory exists before writing the result
    os.makedirs(os.path.dirname(result_file_path), exist_ok=True)
    with open(data_file_path, 'r') as f:
        numbers = [int(x) for x in f.read().split()]
    if not numbers:
        print("Input file is empty")
        return
    max_number = max(numbers)
    count = numbers.count(max_number)
    with open(result_file_path, 'w') as f:
        f.write(str(count))
    print(f"Count of the largest number: {count}")

if __name__ == "__main__":
    main()
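The counting step in main2.py can be sanity-checked in isolation on the sample data1.txt contents shown earlier in this diff (15, 18, 18), for which data2.txt holds 2. A minimal sketch, with the file I/O stripped out:

```python
# Count how many times the largest value occurs in a sequence,
# mirroring the core logic of main2.py.
def count_of_max(numbers):
    max_number = max(numbers)
    return numbers.count(max_number)

print(count_of_max([15, 18, 18]))  # prints 2
```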

View File

@ -0,0 +1,59 @@
# Lab 1
> Hello, my name is Balakhonov Danila, group PIbd-42
>
> *— Balakhonov Danila, PIbd-42*

A video of lab 1 is available at this [link](https://drive.google.com/file/d/1Up_JzDcK_TjYLixpfYXN7PhJmOeg_Uck/view?usp=sharing).
## How do I run lab 1?
### Components required to run lab 1
> This section describes installing the components required to run lab 1 on the GNU/Linux distribution **Ubuntu**.

Running lab 1 requires the following components:
- Git
- Docker
- Docker compose

To install **Git**, enter these commands on the command line:
``` bash
sudo apt-get update
sudo apt-get install git
```
To install **Docker** and **Docker compose**, run these commands:
``` bash
# Set up the Docker repository
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
# Install Docker and its components
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
```
### Running lab 1
To run lab 1, **clone** the repository into any folder and **switch to the branch** balakhonov_danila_lab_1.
Then, in the folder containing `docker-compose.yaml`, run this command:
``` bash
sudo docker-compose up -d
```
This starts the containers in detached mode.
## What technologies were used?
Lab 1 used the following technologies: *git*, *docker*, *docker compose*.
Services chosen to run in the docker-compose file:
- *Gitea* - a convenient version control service for software under development
- *MediaWiki* - a service for creating and maintaining an online encyclopedia
- *PostgreSQL* - the database used by the services above
The system the listed technologies were installed on is Ubuntu 22.
## What does lab 1 do?
Lab 1 consists of writing a docker-compose file for conveniently starting and administering several services in Docker containers at once.

View File

@ -0,0 +1,58 @@
services:
  # PostgreSQL
  db:
    # The latest PostgreSQL container image
    image: postgres
    # Container name
    container_name: db
    # Environment variables for configuring the database
    environment:
      - POSTGRES_USER=gitea
      - POSTGRES_PASSWORD=gitea
      - POSTGRES_DB=gitea
    # Map the directory where the data is stored
    # The host directory is on the left, the container directory on the right
    # Needed so the data survives on the server after the container stops
    volumes:
      - ./postgres:/var/lib/postgresql/data
    # Port for connecting to the database
    ports:
      - 5432:5432
    # Always restart this container when Docker restarts
    restart: always
  # Gitea
  gitea:
    # The latest Gitea image is used
    image: gitea/gitea
    container_name: gitea
    # Always restart this container when Docker restarts
    restart: always
    volumes:
      - ./data:/var/lib/gitea
      - ./config:/etc/gitea
      - /etc/timezone:/etc/timezone:ro
      - /etc/localtime:/etc/localtime:ro
    ports:
      - 3000:3000
      - 2222:2222
    environment:
      - GITEA__database__DB_TYPE=postgres
      - GITEA__database__HOST=db:5432
      - GITEA__database__NAME=gitea
      - GITEA__database__USER=gitea
      - GITEA__database__PASSWD=gitea
    # This container starts only after the db container has started
    depends_on:
      - db
  # MediaWiki
  mediawiki:
    # The latest MediaWiki container image
    image: mediawiki
    container_name: mediawiki
    restart: always
    ports:
      - 8080:80
    links:
      - db
    volumes:
      - ./images:/var/www/html/images

View File

@ -0,0 +1,64 @@
# Lab 2
> Hello, my name is Balakhonov Danila, group PIbd-42
>
> *— Balakhonov Danila, PIbd-42*

A video of lab 2 is available at this [link](https://drive.google.com/file/d/1N4NgWsFLlHY5lGOO3Ps7DPvdJbHNxaqz/view?usp=sharing).
## How do I run lab 2?
### Components required to run lab 2
> This section describes installing the components required to run lab 2 on the GNU/Linux distribution **Ubuntu**.

Running lab 2 requires the following components:
- Git
- Docker
- Docker compose

To install **Git**, enter these commands on the command line:
``` bash
sudo apt-get update
sudo apt-get install git
```
To install **Docker** and **Docker compose**, run these commands:
``` bash
# Set up the Docker repository
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
# Install Docker and its components
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
```
### Running lab 2
To run lab 2, **clone** the repository into any folder and **switch to the branch** balakhonov_danila_lab_2.
Then, in the folder containing `docker-compose.yaml`, run this command:
``` bash
sudo docker-compose up --build
```
This builds and starts the containers. The results are stored inside Docker's data directory. Location of data.txt and result.txt: `/var/lib/docker/volumes/balakhonov_danila_lab_2_result/_data/`
## What technologies were used?
Lab 2 was completed with the following technologies:
- Dockerfile
- Docker compose
- Git
- The .NET SDK, and F# in particular
The services were written in F# using the .NET SDK.
## What does lab 2 do?
Lab 2 starts two services:
1. A service that takes a random file from the `/var/data` directory and copies it to `/var/result/data.txt`
2. A service that finds the largest number in the file `/var/result/data.txt` and saves the count of such numbers in the sequence to `/var/result/result.txt`
Lab 2 provided practice in writing Dockerfiles for deploying projects in containers and in linking them together via docker-compose.yaml.
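The logic of the second service described above can be sketched as follows. This is only an illustrative Python equivalent under the assumptions stated here; the actual implementation in this lab is written in F#, and its source is not part of this diff. The default file names are stand-ins for the real `/var/result` paths:

```python
# Hypothetical Python equivalent of the second service: find the largest
# number in the input file and write how many times it occurs.
def write_max_count(src="data.txt", dst="result.txt"):
    with open(src) as f:
        # Treat the file as whitespace-separated integers
        numbers = [int(token) for token in f.read().split()]
    count = numbers.count(max(numbers))
    with open(dst, "w") as f:
        f.write(str(count))
    return count
```

In the containerized setup, the shared `result` volume is what lets this second service see the file produced by the first one.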

View File

@ -0,0 +1,22 @@
services:
  app1:
    build: ./sigma_app_1/
    volumes:
      # Bind-mount the host folder ./files into the container
      # as /var/data
      - ./files:/var/data
      # Mount the named volume result (kept in Docker's data
      # directory) into the container as /var/result
      - result:/var/result
  app2:
    build: ./skibidi_app_2/
    # app2 does not start until app1 has started:
    # it DEPENDS on app1
    depends_on:
      - app1
    volumes:
      - result:/var/result
volumes:
  # Declares the named volume result,
  # stored in Docker's data directory
  result:

View File

@ -0,0 +1,323 @@
245
678
12
987
456
234
789
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
987
654
321
456
789
12
34
56
78
90
123
456
789
234
567
890
123
456
789
987
654
321
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678
123
456
789
234
567
890
12
34
56
78
90
123
456
789
321
654
987
432
876
543
210
678
345
678

View File

@ -0,0 +1,642 @@
873
62
455
879
235
941
267
811
174
517
382
399
460
221
640
915
384
622
897
212
798
109
477
546
29
995
678
342
135
804
890
453
726
891
664
290
872
190
526
304
12
587
234
753
980
197
824
579
458
15
999
614
704
205
860
537
842
491
668
210
920
477
811
350
731
95
639
287
127
423
1000
394
521
8
267
154
431
715
266
834
173
268
947
582
157
367
882
737
305
472
481
651
960
843
701
122
514
92
658
884
371
458
637
620
793
285
611
785
495
822
849
708
592
465
469
78
734
667
606
241
666
474
569
543
918
68
906
123
501
330
947
111
365
734
249
429
296
16
511
974
317
764
230
542
920
821
718
281
556
575
900
632
720
462
88
275
403
100
418
684
600
119
863
781
225
971
670
80
643
220
176
588
58
202
850
537
934
748
378
817
505
696
21
630
324
117
420
257
493
826
688
305
772
654
927
208
525
511
256
650
447
163
99
74
99
487
306
754
510
132
201
392
785
778
512
258
904
932
589
694
204
884
110
673
152
649
295
387
758
927
538
619
904
651
174
712
104
641
474
198
322
764
204
407
550
42
879
716
368
316
43
600
893
370
137
631
244
571
663
551
907
211
166
746
583
708
771
215
90
829
653
494
563
334
794
745
936
718
126
923
451
668
966
532
935
886
646
75
858
693
859
284
315
679
133
878
292
340
716
128
250
554
482
789
677
308
494
931
144
337
982
713
535
893
939
932
905
805
236
991
781
686
572
951
335
58
303
335
145
608
794
862
792
619
54
292
878
585
293
959
379
20
484
144
678
67
363
946
566
106
442
820
562
109
201
759
481
289
698
25
847
648
733
613
776
989
257
864
32
703
989
465
103
963
515
829
30
303
926
159
586
268
852
953
321
306
978
909
177
835
458
994
885
213
775
385
598
267
754
448
1000
555
354
657
231
979
265
374
68
197
953
648
153
523
761
827
819
63
782
766
882
404
258
672
883
80
111
212
681
812
911
837
194
161
143
427
981
132
357
605
810
414
20
210
772
882
313
186
578
154
523
339
383
903
29
172
62
314
491
289
550
521
327
794
299
678
769
415
266
77
33
438
233
160
11
523
623
254
29
327
924
938
588
444
976
547
775
638
35
23
203
203
927
149
198
150
370
728
775
818
768
99
40
969
435
49
276
360
964
277
283
825
479
331
471
381
652
264
564
891
638
470
291
101
143
93
663
328
841
881
94
327
2
628
474
905
545
421
453
282
276
24
655
295
48
102
49
676
187
773
169
170
165
405
348
4
654
276
343
153
381
756
753
816
474
186
652
67
689
69
920
880
363
637
524
171
753
12
634
648
668
220
408
348
887
341
738
681
408
377
693
234
83
982
417
222
322
253
494
868
951
344
60
23
41
99
944
723
156
813
5
44
62
899
835
482
469
157
637
295
929
992
234
66
31
170
333
92
185
117
627
82
292
796
840
768
532
981
300
125
958
4

View File

@ -0,0 +1,489 @@
522
173
815
671
284
903
477
639
732
143
928
564
812
109
397
249
868
301
848
376
794
99
506
217
645
12
187
930
811
583
684
455
94
499
118
722
603
267
772
947
845
210
495
632
372
930
908
546
327
685
883
235
613
579
762
491
328
672
156
739
1000
421
731
215
867
610
847
732
204
411
515
150
438
651
174
590
725
963
530
889
577
694
417
261
767
480
934
125
558
282
899
96
653
908
303
774
617
407
482
538
239
472
766
118
920
206
797
420
853
205
340
123
387
497
640
24
999
476
77
920
382
405
55
834
371
167
290
300
611
53
470
81
232
14
451
678
623
564
787
99
648
873
803
888
504
186
256
405
102
999
673
721
434
814
305
582
436
90
774
216
706
855
702
307
59
835
812
234
736
168
523
219
868
365
294
500
207
927
450
521
851
703
992
327
916
554
846
658
88
659
628
764
84
45
10
870
779
320
882
942
93
792
836
137
489
862
391
337
887
114
237
178
874
569
135
919
931
231
50
995
215
658
139
484
292
903
113
755
333
829
942
360
172
689
42
127
799
191
455
533
234
15
404
636
373
884
921
977
113
227
703
173
297
440
604
575
971
855
82
252
589
276
826
206
166
482
375
174
612
818
854
832
809
569
306
993
931
289
148
943
421
784
441
536
426
548
49
687
415
505
951
583
368
172
974
47
173
570
264
754
701
693
796
914
809
310
512
725
963
829
614
220
410
631
860
270
158
168
595
62
715
913
517
157
5
660
274
414
139
300
698
675
263
872
292
142
375
696
895
302
75
576
899
524
362
721
916
883
347
980
29
392
839
971
593
708
804
678
234
719
659
418
914
437
550
418
576
776
293
737
348
292
48
975
547
205
831
783
587
657
132
733
53
700
785
292
332
771
849
994
905
460
420
923
663
134
658
673
618
779
951
244
425
312
436
878
538
236
805
457
897
799
134
469
56
724
370
521
654
20
260
315
525
501
433
90
368
192
162
198
65
652
613
222
160
76
755
541
305
257
669
179
849
878
249
224
4
1
860
967
738
712
281
834
908
774
964
880
902
234
635
138
305
532
585
956
68
21
278
639
622
473
769
161
580
285
204
410
115
430
953
968
593
703
704
469
835
623
991

View File

@ -0,0 +1,4 @@
bin/
obj/
Dockerfile
README.md

View File

@ -0,0 +1,484 @@
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.
##
## Get latest from `dotnet new gitignore`
# dotenv files
.env
# User-specific files
*.rsuser
*.suo
*.user
*.userosscache
*.sln.docstates
# User-specific files (MonoDevelop/Xamarin Studio)
*.userprefs
# Mono auto generated files
mono_crash.*
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
[Ww][Ii][Nn]32/
[Aa][Rr][Mm]/
[Aa][Rr][Mm]64/
bld/
[Bb]in/
[Oo]bj/
[Ll]og/
[Ll]ogs/
# Visual Studio 2015/2017 cache/options directory
.vs/
# Uncomment if you have tasks that create the project's static files in wwwroot
#wwwroot/
# Visual Studio 2017 auto generated files
Generated\ Files/
# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*
# NUnit
*.VisualState.xml
TestResult.xml
nunit-*.xml
# Build Results of an ATL Project
[Dd]ebugPS/
[Rr]eleasePS/
dlldata.c
# Benchmark Results
BenchmarkDotNet.Artifacts/
# .NET
project.lock.json
project.fragment.lock.json
artifacts/
# Tye
.tye/
# ASP.NET Scaffolding
ScaffoldingReadMe.txt
# StyleCop
StyleCopReport.xml
# Files built by Visual Studio
*_i.c
*_p.c
*_h.h
*.ilk
*.meta
*.obj
*.iobj
*.pch
*.pdb
*.ipdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*_wpftmp.csproj
*.log
*.tlog
*.vspscc
*.vssscc
.builds
*.pidb
*.svclog
*.scc
# Chutzpah Test files
_Chutzpah*
# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opendb
*.opensdf
*.sdf
*.cachefile
*.VC.db
*.VC.VC.opendb
# Visual Studio profiler
*.psess
*.vsp
*.vspx
*.sap
# Visual Studio Trace Files
*.e2e
# TFS 2012 Local Workspace
$tf/
# Guidance Automation Toolkit
*.gpState
# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper
*.DotSettings.user
# TeamCity is a build add-in
_TeamCity*
# DotCover is a Code Coverage Tool
*.dotCover
# AxoCover is a Code Coverage Tool
.axoCover/*
!.axoCover/settings.json
# Coverlet is a free, cross platform Code Coverage Tool
coverage*.json
coverage*.xml
coverage*.info
# Visual Studio code coverage results
*.coverage
*.coveragexml
# NCrunch
_NCrunch_*
.*crunch*.local.xml
nCrunchTemp_*
# MightyMoose
*.mm.*
AutoTest.Net/
# Web workbench (sass)
.sass-cache/
# Installshield output folder
[Ee]xpress/
# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html
# Click-Once directory
publish/
# Publish Web Output
*.[Pp]ublish.xml
*.azurePubxml
# Note: Comment the next line if you want to checkin your web deploy settings,
# but database connection strings (with potential passwords) will be unencrypted
*.pubxml
*.publishproj
# Microsoft Azure Web App publish settings. Comment the next line if you want to
# checkin your Azure Web App publish settings, but sensitive information contained
# in these scripts will be unencrypted
PublishScripts/
# NuGet Packages
*.nupkg
# NuGet Symbol Packages
*.snupkg
# The packages folder can be ignored because of Package Restore
**/[Pp]ackages/*
# except build/, which is used as an MSBuild target.
!**/[Pp]ackages/build/
# Uncomment if necessary however generally it will be regenerated when needed
#!**/[Pp]ackages/repositories.config
# NuGet v3's project.json files produces more ignorable files
*.nuget.props
*.nuget.targets
# Microsoft Azure Build Output
csx/
*.build.csdef
# Microsoft Azure Emulator
ecf/
rcf/
# Windows Store app package directories and files
AppPackages/
BundleArtifacts/
Package.StoreAssociation.xml
_pkginfo.txt
*.appx
*.appxbundle
*.appxupload
# Visual Studio cache files
# files ending in .cache can be ignored
*.[Cc]ache
# but keep track of directories ending in .cache
!?*.[Cc]ache/
# Others
ClientBin/
~$*
*~
*.dbmdl
*.dbproj.schemaview
*.jfm
*.pfx
*.publishsettings
orleans.codegen.cs
# Including strong name files can present a security risk
# (https://github.com/github/gitignore/pull/2483#issue-259490424)
#*.snk
# Since there are multiple workflows, uncomment next line to ignore bower_components
# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
#bower_components/
# RIA/Silverlight projects
Generated_Code/
# Backup & report files from converting an old project file
# to a newer Visual Studio version. Backup files are not needed,
# because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm
ServiceFabricBackup/
*.rptproj.bak
# SQL Server files
*.mdf
*.ldf
*.ndf
# Business Intelligence projects
*.rdl.data
*.bim.layout
*.bim_*.settings
*.rptproj.rsuser
*- [Bb]ackup.rdl
*- [Bb]ackup ([0-9]).rdl
*- [Bb]ackup ([0-9][0-9]).rdl
# Microsoft Fakes
FakesAssemblies/
# GhostDoc plugin setting file
*.GhostDoc.xml
# Node.js Tools for Visual Studio
.ntvs_analysis.dat
node_modules/
# Visual Studio 6 build log
*.plg
# Visual Studio 6 workspace options file
*.opt
# Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
*.vbw
# Visual Studio 6 auto-generated project file (contains which files were open etc.)
*.vbp
# Visual Studio 6 workspace and project file (working project files containing files to include in project)
*.dsw
*.dsp
# Visual Studio 6 technical files
*.ncb
*.aps
# Visual Studio LightSwitch build output
**/*.HTMLClient/GeneratedArtifacts
**/*.DesktopClient/GeneratedArtifacts
**/*.DesktopClient/ModelManifest.xml
**/*.Server/GeneratedArtifacts
**/*.Server/ModelManifest.xml
_Pvt_Extensions
# Paket dependency manager
.paket/paket.exe
paket-files/
# FAKE - F# Make
.fake/
# CodeRush personal settings
.cr/personal
# Python Tools for Visual Studio (PTVS)
__pycache__/
*.pyc
# Cake - Uncomment if you are using it
# tools/**
# !tools/packages.config
# Tabs Studio
*.tss
# Telerik's JustMock configuration file
*.jmconfig
# BizTalk build output
*.btp.cs
*.btm.cs
*.odx.cs
*.xsd.cs
# OpenCover UI analysis results
OpenCover/
# Azure Stream Analytics local run output
ASALocalRun/
# MSBuild Binary and Structured Log
*.binlog
# NVidia Nsight GPU debugger configuration file
*.nvuser
# MFractors (Xamarin productivity tool) working folder
.mfractor/
# Local History for Visual Studio
.localhistory/
# Visual Studio History (VSHistory) files
.vshistory/
# BeatPulse healthcheck temp database
healthchecksdb
# Backup folder for Package Reference Convert tool in Visual Studio 2017
MigrationBackup/
# Ionide (cross platform F# VS Code tools) working folder
.ionide/
# Fody - auto-generated XML schema
FodyWeavers.xsd
# VS Code files for those working on multiple tools
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
*.code-workspace
# Local History for Visual Studio Code
.history/
# Windows Installer files from build outputs
*.cab
*.msi
*.msix
*.msm
*.msp
# JetBrains Rider
*.sln.iml
.idea
##
## Visual studio for Mac
##
# globs
Makefile.in
*.userprefs
*.usertasks
config.make
config.status
aclocal.m4
install-sh
autom4te.cache/
*.tar.gz
tarballs/
test-results/
# Mac bundle stuff
*.dmg
*.app
# content below from: https://github.com/github/gitignore/blob/master/Global/macOS.gitignore
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
# content below from: https://github.com/github/gitignore/blob/master/Global/Windows.gitignore
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
[Dd]esktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msix
*.msm
*.msp
# Windows shortcuts
*.lnk
# Vim temporary swap files
*.swp

View File

@ -0,0 +1,14 @@
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /App
# Copy everything
COPY . ./
# Restore as distinct layers
RUN dotnet restore
# Build and publish a release
RUN dotnet publish -c Release -o out
FROM mcr.microsoft.com/dotnet/runtime:8.0 AS runtime
WORKDIR /App
COPY --from=build /App/out .
ENTRYPOINT ["dotnet", "sigma_app_1.dll"]

View File

@ -0,0 +1,14 @@
let PATH = @"/var/data/"
let RESULT_PATH = @"/var/result/data.txt"
let getFiles(path: string): seq<string> =
    System.IO.Directory.EnumerateFiles(path)
let getRandFile(files: seq<string>) =
    let rand = System.Random()
    let index = rand.Next(Seq.length files)
    Seq.item index files
let files = getFiles(PATH)
let randFile = getRandFile(files)
System.IO.File.Copy(randFile, RESULT_PATH)

View File

@ -0,0 +1,4 @@
# First program of lab 2
> Variant 6
>
> Takes a random file from the `/var/data` directory and copies it to `/var/result/data.txt`
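For comparison with the F# program above, the selection step can be sketched in Go (the language used in this repo's other labs); `pickRandom` is a hypothetical helper, not part of the lab code:

```go
package main

import (
	"fmt"
	"math/rand"
)

// pickRandom returns a pseudo-randomly chosen element of files,
// the same operation the F# program performs with Seq.item.
func pickRandom(files []string, seed int64) string {
	r := rand.New(rand.NewSource(seed))
	return files[r.Intn(len(files))]
}

func main() {
	files := []string{"a.txt", "b.txt", "c.txt"}
	fmt.Println(pickRandom(files, 1))
}
```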

View File

@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <Compile Include="Program.fs" />
  </ItemGroup>
</Project>

View File

@ -0,0 +1,4 @@
bin/
obj/
Dockerfile
README.md

View File

@ -0,0 +1,484 @@
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.
##
## Get latest from `dotnet new gitignore`
# dotenv files
.env
# User-specific files
*.rsuser
*.suo
*.user
*.userosscache
*.sln.docstates
# User-specific files (MonoDevelop/Xamarin Studio)
*.userprefs
# Mono auto generated files
mono_crash.*
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
[Ww][Ii][Nn]32/
[Aa][Rr][Mm]/
[Aa][Rr][Mm]64/
bld/
[Bb]in/
[Oo]bj/
[Ll]og/
[Ll]ogs/
# Visual Studio 2015/2017 cache/options directory
.vs/
# Uncomment if you have tasks that create the project's static files in wwwroot
#wwwroot/
# Visual Studio 2017 auto generated files
Generated\ Files/
# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*
# NUnit
*.VisualState.xml
TestResult.xml
nunit-*.xml
# Build Results of an ATL Project
[Dd]ebugPS/
[Rr]eleasePS/
dlldata.c
# Benchmark Results
BenchmarkDotNet.Artifacts/
# .NET
project.lock.json
project.fragment.lock.json
artifacts/
# Tye
.tye/
# ASP.NET Scaffolding
ScaffoldingReadMe.txt
# StyleCop
StyleCopReport.xml
# Files built by Visual Studio
*_i.c
*_p.c
*_h.h
*.ilk
*.meta
*.obj
*.iobj
*.pch
*.pdb
*.ipdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*_wpftmp.csproj
*.log
*.tlog
*.vspscc
*.vssscc
.builds
*.pidb
*.svclog
*.scc
# Chutzpah Test files
_Chutzpah*
# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opendb
*.opensdf
*.sdf
*.cachefile
*.VC.db
*.VC.VC.opendb
# Visual Studio profiler
*.psess
*.vsp
*.vspx
*.sap
# Visual Studio Trace Files
*.e2e
# TFS 2012 Local Workspace
$tf/
# Guidance Automation Toolkit
*.gpState
# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper
*.DotSettings.user
# TeamCity is a build add-in
_TeamCity*
# DotCover is a Code Coverage Tool
*.dotCover
# AxoCover is a Code Coverage Tool
.axoCover/*
!.axoCover/settings.json
# Coverlet is a free, cross platform Code Coverage Tool
coverage*.json
coverage*.xml
coverage*.info
# Visual Studio code coverage results
*.coverage
*.coveragexml
# NCrunch
_NCrunch_*
.*crunch*.local.xml
nCrunchTemp_*
# MightyMoose
*.mm.*
AutoTest.Net/
# Web workbench (sass)
.sass-cache/
# Installshield output folder
[Ee]xpress/
# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html
# Click-Once directory
publish/
# Publish Web Output
*.[Pp]ublish.xml
*.azurePubxml
# Note: Comment the next line if you want to checkin your web deploy settings,
# but database connection strings (with potential passwords) will be unencrypted
*.pubxml
*.publishproj
# Microsoft Azure Web App publish settings. Comment the next line if you want to
# checkin your Azure Web App publish settings, but sensitive information contained
# in these scripts will be unencrypted
PublishScripts/
# NuGet Packages
*.nupkg
# NuGet Symbol Packages
*.snupkg
# The packages folder can be ignored because of Package Restore
**/[Pp]ackages/*
# except build/, which is used as an MSBuild target.
!**/[Pp]ackages/build/
# Uncomment if necessary however generally it will be regenerated when needed
#!**/[Pp]ackages/repositories.config
# NuGet v3's project.json files produces more ignorable files
*.nuget.props
*.nuget.targets
# Microsoft Azure Build Output
csx/
*.build.csdef
# Microsoft Azure Emulator
ecf/
rcf/
# Windows Store app package directories and files
AppPackages/
BundleArtifacts/
Package.StoreAssociation.xml
_pkginfo.txt
*.appx
*.appxbundle
*.appxupload
# Visual Studio cache files
# files ending in .cache can be ignored
*.[Cc]ache
# but keep track of directories ending in .cache
!?*.[Cc]ache/
# Others
ClientBin/
~$*
*~
*.dbmdl
*.dbproj.schemaview
*.jfm
*.pfx
*.publishsettings
orleans.codegen.cs
# Including strong name files can present a security risk
# (https://github.com/github/gitignore/pull/2483#issue-259490424)
#*.snk
# Since there are multiple workflows, uncomment next line to ignore bower_components
# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
#bower_components/
# RIA/Silverlight projects
Generated_Code/
# Backup & report files from converting an old project file
# to a newer Visual Studio version. Backup files are not needed,
# because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm
ServiceFabricBackup/
*.rptproj.bak
# SQL Server files
*.mdf
*.ldf
*.ndf
# Business Intelligence projects
*.rdl.data
*.bim.layout
*.bim_*.settings
*.rptproj.rsuser
*- [Bb]ackup.rdl
*- [Bb]ackup ([0-9]).rdl
*- [Bb]ackup ([0-9][0-9]).rdl
# Microsoft Fakes
FakesAssemblies/
# GhostDoc plugin setting file
*.GhostDoc.xml
# Node.js Tools for Visual Studio
.ntvs_analysis.dat
node_modules/
# Visual Studio 6 build log
*.plg
# Visual Studio 6 workspace options file
*.opt
# Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
*.vbw
# Visual Studio 6 auto-generated project file (contains which files were open etc.)
*.vbp
# Visual Studio 6 workspace and project file (working project files containing files to include in project)
*.dsw
*.dsp
# Visual Studio 6 technical files
*.ncb
*.aps
# Visual Studio LightSwitch build output
**/*.HTMLClient/GeneratedArtifacts
**/*.DesktopClient/GeneratedArtifacts
**/*.DesktopClient/ModelManifest.xml
**/*.Server/GeneratedArtifacts
**/*.Server/ModelManifest.xml
_Pvt_Extensions
# Paket dependency manager
.paket/paket.exe
paket-files/
# FAKE - F# Make
.fake/
# CodeRush personal settings
.cr/personal
# Python Tools for Visual Studio (PTVS)
__pycache__/
*.pyc
# Cake - Uncomment if you are using it
# tools/**
# !tools/packages.config
# Tabs Studio
*.tss
# Telerik's JustMock configuration file
*.jmconfig
# BizTalk build output
*.btp.cs
*.btm.cs
*.odx.cs
*.xsd.cs
# OpenCover UI analysis results
OpenCover/
# Azure Stream Analytics local run output
ASALocalRun/
# MSBuild Binary and Structured Log
*.binlog
# NVidia Nsight GPU debugger configuration file
*.nvuser
# MFractors (Xamarin productivity tool) working folder
.mfractor/
# Local History for Visual Studio
.localhistory/
# Visual Studio History (VSHistory) files
.vshistory/
# BeatPulse healthcheck temp database
healthchecksdb
# Backup folder for Package Reference Convert tool in Visual Studio 2017
MigrationBackup/
# Ionide (cross platform F# VS Code tools) working folder
.ionide/
# Fody - auto-generated XML schema
FodyWeavers.xsd
# VS Code files for those working on multiple tools
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
*.code-workspace
# Local History for Visual Studio Code
.history/
# Windows Installer files from build outputs
*.cab
*.msi
*.msix
*.msm
*.msp
# JetBrains Rider
*.sln.iml
.idea
##
## Visual studio for Mac
##
# globs
Makefile.in
*.userprefs
*.usertasks
config.make
config.status
aclocal.m4
install-sh
autom4te.cache/
*.tar.gz
tarballs/
test-results/
# Mac bundle stuff
*.dmg
*.app
# content below from: https://github.com/github/gitignore/blob/master/Global/macOS.gitignore
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
# content below from: https://github.com/github/gitignore/blob/master/Global/Windows.gitignore
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
[Dd]esktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msix
*.msm
*.msp
# Windows shortcuts
*.lnk
# Vim temporary swap files
*.swp

View File

@ -0,0 +1,14 @@
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /App
# Copy everything
COPY . ./
# Restore as distinct layers
RUN dotnet restore
# Build and publish a release
RUN dotnet publish -c Release -o out
FROM mcr.microsoft.com/dotnet/runtime:8.0 AS runtime
WORKDIR /App
COPY --from=build /App/out .
ENTRYPOINT ["dotnet", "skibidi_app_2.dll"]

View File

@ -0,0 +1,16 @@
let INPUT_FILE = @"/var/result/data.txt"
let OUTPUT_FILE = @"/var/result/result.txt"
let getNumbersFromFile(path: string): seq<int> =
    System.IO.File.ReadLines(path)
    |> Seq.map int
let getCountOfMaxNumber(numbers: seq<int>): int =
    numbers
    |> Seq.max
    |> fun maxNum -> Seq.filter ((=) maxNum) numbers
    |> Seq.length
let numbers = getNumbersFromFile(INPUT_FILE)
let count = getCountOfMaxNumber(numbers)
System.IO.File.WriteAllText(OUTPUT_FILE, string count)

View File

@ -0,0 +1,4 @@
# Second program of lab 2
> Variant 3
>
> Finds the largest number in the file `/var/result/data.txt` and writes the count of its occurrences in the sequence to `/var/result/result.txt`
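The max-then-count pipeline from the F# program can be sketched as plain loops in Go (`countOfMax` is a hypothetical helper introduced here for illustration):

```go
package main

import "fmt"

// countOfMax returns how many times the maximum value occurs in nums,
// mirroring the Seq.max / Seq.filter pipeline in Program.fs.
func countOfMax(nums []int) int {
	if len(nums) == 0 {
		return 0
	}
	max := nums[0]
	for _, n := range nums[1:] {
		if n > max {
			max = n
		}
	}
	count := 0
	for _, n := range nums {
		if n == max {
			count++
		}
	}
	return count
}

func main() {
	fmt.Println(countOfMax([]int{5, 3, 5, 1})) // the maximum 5 occurs twice
}
```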

View File

@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <Compile Include="Program.fs" />
  </ItemGroup>
</Project>

2
bazunov_andrew_lab_1/.gitignore vendored Normal file
View File

@ -0,0 +1,2 @@
ollama
./ollama

View File

@ -0,0 +1,33 @@
# Distributed Computing and Applications, Lab 1
## _Author: Bazunov Andrey Igrevich, PIbd-42_
The following services were chosen:
- 1. Ollama (_a service for running LLM models_)
- 2. Open Web UI (_a service for convenient chat with the model served by Ollama_)
- 3. Gitea (_a Git service_)
# Docker
>Before running, install Docker and check the version
```sh
docker-compose --version
```
>Then configure the docker-compose.yaml file and start the containers
```sh
docker-compose up -d
```
>To shut the containers down, use
```sh
docker-compose down
```
---
> Note: after the containers start, enter the **ollama** container and install the [gemma2](https://ollama.com/library/gemma2:2b) model
> ```sh
> docker-compose exec ollama ollama run gemma2:2b
> ```
---
After that, the Open Web UI web service is available at **localhost:8080** for chatting with the model, and Gitea at **localhost:3000** - [demo video](https://vk.com/video/@viltskaa?z=video236673313_456239574%2Fpl_236673313_-2)

View File

@ -0,0 +1,61 @@
services:
  gitea: # service name
    image: gitea/gitea:latest # image name
    container_name: gitea # container name, can be arbitrary
    ports:
      - "3000:3000" # expose the Gitea port on the host
    volumes: # storage
      - data:/data
    environment: # environment variables
      USER_UID: 1000
      USER_GID: 1000
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: always
    ports:
      - 7869:11434
    pull_policy: always
    tty: true
    volumes:
      - .:/code
      - ./ollama/ollama:/root/.ollama # directory for Ollama data
    environment:
      - OLLAMA_KEEP_ALIVE=24h
      - OLLAMA_HOST=0.0.0.0 # host for the Ollama API
    networks:
      - ollama-docker
    command: ["serve"] # run Ollama in server mode
  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main # Open Web UI image
    container_name: ollama-webui
    restart: unless-stopped
    volumes:
      - ./ollama/ollama-webui:/app/backend/data
    ports:
      - 8080:8080 # web interface port
    environment: # https://docs.openwebui.com/getting-started/env-configuration#default_models
      - OLLAMA_BASE_URLS=http://host.docker.internal:7869
      - ENV=dev
      - WEBUI_AUTH=False
      - WEBUI_NAME=Viltskaa AI
      - WEBUI_URL=http://localhost:8080
      - WEBUI_SECRET_KEY=t0p-s3cr3t
    depends_on:
      - ollama
    extra_hosts:
      - host.docker.internal:host-gateway
    networks:
      - ollama-docker
networks:
  ollama-docker:
    external: false
volumes:
  ollama:
    driver: local
  data:
    driver: local

View File

@ -0,0 +1,14 @@
# Use the official Go image as the base
FROM golang:1.23
# Set the working directory
WORKDIR /app
# Copy the module files
COPY . .
# Build the module
RUN go build -o /bin/FileCreator
# Run the module
CMD ["/bin/FileCreator"]

View File

@ -0,0 +1 @@
module FileCreator

View File

@ -0,0 +1,92 @@
package main
import (
"crypto/md5"
"encoding/hex"
"fmt"
"math/rand"
"os"
"path/filepath"
)
const DIR = "/var/data"
func Exists(name string) (bool, error) {
	_, err := os.Stat(name)
	if os.IsNotExist(err) {
		return false, nil
	}
	// Stat succeeding (err == nil) means the file exists
	return err == nil, err
}
func CreateDirectory(dirs string) error {
	if _, err := os.Stat(dirs); os.IsNotExist(err) {
		// directories need the execute bit to be traversable, so 0755 rather than 0664
		err := os.MkdirAll(dirs, 0755)
		if err != nil {
			return err
		}
	}
	return nil
}
func CreateFileOrOpenIfExist(name string) (*os.File, error) {
err := CreateDirectory(filepath.Dir(name))
if err != nil {
return nil, err
}
exists, err := Exists(name)
if err != nil {
return nil, err
}
if exists {
return os.OpenFile(name, os.O_WRONLY|os.O_CREATE, 0664)
}
return os.Create(name)
}
func CreateFileAndWriteData(filename string) error {
file, err := CreateFileOrOpenIfExist(filename)
if err != nil {
return err
}
lines := rand.Intn(1000) + 100
for i := 0; i < lines; i++ {
randomValueForLine := rand.Intn(1_000_000)
_, err = fmt.Fprintf(file, "%d\r\n", randomValueForLine)
if err != nil {
return err
}
}
err = file.Close()
if err != nil {
return err
}
return nil
}
func GetMD5Hash(text string) string {
hash := md5.Sum([]byte(text))
return hex.EncodeToString(hash[:])
}
func main() {
for i := 0; i < 10; i++ {
filename := fmt.Sprintf("%s/%s.txt", DIR, GetMD5Hash(fmt.Sprintf("%d", i)))
err := CreateFileAndWriteData(filename)
if err != nil {
fmt.Println(err)
} else {
fmt.Printf("Created file %s\n", filename)
}
}
err := CreateFileAndWriteData(DIR + "/data.txt")
if err != nil {
fmt.Println(err)
} else {
fmt.Printf("Created file %s\n", DIR+"/data.txt")
}
}

View File

@ -0,0 +1,14 @@
# Use the official Go image as the base
FROM golang:1.23
# Set the working directory
WORKDIR /app
# Copy the module files
COPY . .
# Build the module
RUN go build -o /bin/FirstService
# Run the module
CMD ["/bin/FirstService"]

View File

@ -0,0 +1 @@
module RVIP2

View File

@ -0,0 +1,94 @@
package main
import (
"fmt"
"os"
)
// Produces the file /var/result/data.txt in which each line is
// the number of characters in a file name from the /var/data directory.
const INPUT = "/var/data"
const OUTPUT = "/var/result"
func GetListFilesInDirectory(directory string) ([]string, error) {
f, err := os.Open(directory)
if err != nil {
fmt.Println(err)
return nil, err
}
files, err := f.Readdir(0)
if err != nil {
fmt.Println(err)
return nil, err
}
var fileNames []string
for _, file := range files {
fileName := file.Name()
fileNames = append(fileNames, fileName)
}
return fileNames, nil
}
func Exists(name string) (bool, error) {
_, err := os.Stat(name)
if os.IsNotExist(err) {
return false, nil
}
return err != nil, err
}
func CreateFileOrOpenIfExist(name string) (*os.File, error) {
exists, err := Exists(name)
if err != nil {
return nil, err
}
if exists {
return os.OpenFile(name, os.O_WRONLY|os.O_CREATE, 0664)
}
return os.Create(name)
}
func CreateFileAndWriteData(filename string, lines []string) error {
	file, err := CreateFileOrOpenIfExist(filename)
	if err != nil {
		return err
	}
	for _, line := range lines {
		// Fprintln avoids treating the data as a format string and adds the newline
		_, err = fmt.Fprintln(file, line)
		if err != nil {
			return err
		}
	}
	err = file.Close()
	if err != nil {
		return err
	}
	return nil
}
func main() {
	filenames, err := GetListFilesInDirectory(INPUT)
	if err != nil {
		fmt.Println(err)
		return
	}
	var lengthsOfFilenames []string
	for _, filename := range filenames {
		fmt.Println(filename)
		lengthsOfFilenames = append(lengthsOfFilenames, fmt.Sprintf("%d", len(filename)))
	}
	// Write the lengths, not the names themselves, as the comment above specifies
	err = CreateFileAndWriteData(OUTPUT+"/data.txt", lengthsOfFilenames)
	if err != nil {
		return
	}
	fmt.Println("First Service is done.")
}
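The transformation this service performs can be isolated as a small pure function for clarity (`filenameLengths` is a hypothetical helper, not part of the lab code):

```go
package main

import "fmt"

// filenameLengths maps each file name to the decimal string of its length,
// one output line per name — the core of FirstService.
func filenameLengths(names []string) []string {
	out := make([]string, 0, len(names))
	for _, n := range names {
		out = append(out, fmt.Sprintf("%d", len(n)))
	}
	return out
}

func main() {
	fmt.Println(filenameLengths([]string{"data.txt", "a.txt"})) // [8 5]
}
```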

View File

@ -0,0 +1,14 @@
# Use the official Go image as the base
FROM golang:1.23
# Set the working directory
WORKDIR /app
# Copy the module files
COPY . .
# Build the module
RUN go build -o /bin/SecondService
# Run the module
CMD ["/bin/SecondService"]

View File

@ -0,0 +1 @@
module SecondService

View File

@ -0,0 +1,79 @@
package main
import (
"bufio"
"fmt"
"os"
)
// Finds the smallest number in /var/data/data.txt and writes its cube to /var/result/result.txt.
const INPUT = "/var/data/data.txt"
const OUTPUT = "/var/result/result.txt"
func ReadlinesFromFile(filename string) ([]string, error) {
file, err := os.Open(filename)
if err != nil {
return nil, err
}
var output []string
scanner := bufio.NewScanner(file)
for scanner.Scan() {
output = append(output, scanner.Text())
}
err = file.Close()
if err != nil {
return nil, err
}
return output, nil
}
func WriteIntToFile(filename string, i int) error {
file, err := os.Create(filename)
if err != nil {
return err
}
	defer func(file *os.File) {
		_ = file.Close() // best-effort close; the write error below is what matters
	}(file)
_, err = file.WriteString(fmt.Sprintf("%d\n", i))
if err != nil {
return err
}
return nil
}
func main() {
	lines, err := ReadlinesFromFile(INPUT)
	if err != nil {
		fmt.Println(err)
		return
	}
	minValue := 0
	first := true
	for _, line := range lines {
		value := 0
		// Sscanf returns the number of items scanned, not the value itself
		if _, err := fmt.Sscanf(line, "%d", &value); err != nil {
			fmt.Println(err)
			continue
		}
		if first || value < minValue {
			minValue = value
			first = false
		}
	}
	// the task asks for the cube of the minimum, not the minimum itself
	cube := minValue * minValue * minValue
	if err = WriteIntToFile(OUTPUT, cube); err != nil {
		return
	}
	fmt.Printf("Write %d to %s\n", cube, OUTPUT)
	fmt.Println("Second Service is done.")
}
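The intended computation (minimum, then its cube) can be sketched as a pure function (`minCube` is a hypothetical helper introduced for illustration):

```go
package main

import "fmt"

// minCube returns the cube of the smallest value in nums;
// ok is false for empty input. This is what SecondService is specified to compute.
func minCube(nums []int) (result int, ok bool) {
	if len(nums) == 0 {
		return 0, false
	}
	min := nums[0]
	for _, n := range nums[1:] {
		if n < min {
			min = n
		}
	}
	return min * min * min, true
}

func main() {
	v, _ := minCube([]int{4, 2, 9})
	fmt.Println(v) // 2 cubed = 8
}
```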

View File

@ -0,0 +1,27 @@
services:
  file_generate:
    build:
      context: ./FileCreator
      dockerfile: Dockerfile
    volumes:
      - ./data:/var/data # mount the local data folder at /var/data in the container
  first_service:
    build:
      context: ./FirstService
      dockerfile: Dockerfile
    volumes:
      - ./data:/var/data
      - ./data:/var/result
    depends_on:
      - file_generate
  second_service:
    build:
      context: ./SecondService
      dockerfile: Dockerfile
    volumes:
      - ./data:/var/data
      - ./data:/var/result
    depends_on:
      - first_service

BIN
bazunov_andrew_lab_3/PersonApp/.DS_Store vendored Normal file

Binary file not shown.

View File

@ -0,0 +1,4 @@
PORT=8080
TASK_APP_URL=http://task-app:8000
TIMEOUT=15
DATABASE=./database.db

View File

@ -0,0 +1,14 @@
FROM golang:1.23
WORKDIR /app
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN go build -o /bin/PersonApp
EXPOSE 8080
CMD ["/bin/PersonApp"]

Binary file not shown.

View File

@ -0,0 +1,10 @@
module PersonApp
go 1.23.1
require (
github.com/gorilla/mux v1.8.1
github.com/mattn/go-sqlite3 v1.14.24
)
require github.com/joho/godotenv v1.5.1 // indirect

View File

@ -0,0 +1,6 @@
github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/mattn/go-sqlite3 v1.14.24 h1:tpSp2G2KyMnnQu99ngJ47EIkWVmliIizyZBfPrBWDRM=
github.com/mattn/go-sqlite3 v1.14.24/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=

View File

@ -0,0 +1,157 @@
package handlers
import (
"PersonApp/httpClient"
"PersonApp/models"
"PersonApp/repository"
"encoding/json"
"fmt"
"github.com/gorilla/mux"
"net/http"
"strconv"
)
func InitRoutes(r *mux.Router, rep repository.PersonRepository, cln httpClient.Client) {
r.HandleFunc("/", GetPersons(rep, cln)).Methods("GET")
r.HandleFunc("/{id:[0-9]+}", GetPersonById(rep, cln)).Methods("GET")
r.HandleFunc("/", CreatePerson(rep)).Methods("POST")
r.HandleFunc("/{id:[0-9]+}", UpdatePerson(rep)).Methods("PUT")
r.HandleFunc("/{id:[0-9]+}", DeletePerson(rep)).Methods("DELETE")
}
func GetPersons(rep repository.PersonRepository, cln httpClient.Client) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
fmt.Println("GET PERSONS")
persons, err := rep.GetAllPersons()
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
for i := 0; i < len(persons); i++ {
tasks, _ := cln.GetPersonTasks(persons[i].Id)
persons[i].Tasks = tasks
}
err = json.NewEncoder(w).Encode(persons)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func GetPersonById(rep repository.PersonRepository, cln httpClient.Client) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
person, err := rep.GetPersonById(id)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
tasks, err := cln.GetPersonTasks(id)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
} else {
person.Tasks = tasks
}
err = json.NewEncoder(w).Encode(person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func CreatePerson(rep repository.PersonRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
var person *models.Person
err := json.NewDecoder(r.Body).Decode(&person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
person, err = rep.CreatePerson(*person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.WriteHeader(http.StatusCreated)
err = json.NewEncoder(w).Encode(person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func UpdatePerson(rep repository.PersonRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
var person *models.Person
err = json.NewDecoder(r.Body).Decode(&person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
person, err = rep.UpdatePerson(models.Person{
Id: id,
Name: person.Name,
Tasks: nil,
})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.WriteHeader(http.StatusAccepted)
err = json.NewEncoder(w).Encode(person)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
}
}
}
func DeletePerson(rep repository.PersonRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
err = rep.DeletePerson(id)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.WriteHeader(http.StatusOK)
}
}

View File

@ -0,0 +1,72 @@
package httpClient
import (
"PersonApp/models"
"encoding/json"
"fmt"
"io"
"net/http"
"time"
)
type Client interface {
GetPersonTasks(id int) ([]models.Task, error)
TestConnection() (bool, error)
}
type client struct {
BaseUrl string
Timeout time.Duration
}
func (c *client) TestConnection() (bool, error) {
	// Timeout is stored as a bare count of seconds, so scale it as GetPersonTasks does
	client := &http.Client{Timeout: c.Timeout * time.Second}
	url := fmt.Sprintf("%s/", c.BaseUrl)
resp, err := client.Get(url)
if err != nil {
return false, err
}
defer func(Body io.ReadCloser) {
err := Body.Close()
if err != nil {
return
}
}(resp.Body)
if resp.StatusCode != http.StatusOK {
return false, fmt.Errorf("bad status code: %d", resp.StatusCode)
}
return true, nil
}
func (c *client) GetPersonTasks(id int) ([]models.Task, error) {
client := &http.Client{Timeout: c.Timeout * time.Second}
url := fmt.Sprintf("%s/f/%d", c.BaseUrl, id)
resp, err := client.Get(url)
if err != nil {
return nil, err
}
defer func(Body io.ReadCloser) {
err := Body.Close()
if err != nil {
}
}(resp.Body)
body, _ := io.ReadAll(resp.Body)
var tasks []models.Task
if err := json.Unmarshal(body, &tasks); err != nil {
fmt.Printf("Unmarshal error: %s", err)
return []models.Task{}, err
}
return tasks, nil
}
func NewClient(baseUrl string, timeout time.Duration) Client {
return &client{BaseUrl: baseUrl, Timeout: timeout}
}

View File

@ -0,0 +1,34 @@
GET http://localhost/person-app/
Accept: application/json
###
GET http://localhost/person-app/1
Accept: application/json
###
POST http://localhost/person-app/
Accept: application/json
Content-Type: application/json
{
"name": "TEST3"
}
###
PUT http://localhost/person-app/3
Accept: application/json
Content-Type: application/json
{
"name": "TEST11"
}
###
DELETE http://localhost/person-app/3
Accept: application/json
###

View File

@ -0,0 +1,47 @@
package main
import (
"PersonApp/handlers"
"PersonApp/httpClient"
"PersonApp/repository"
"PersonApp/storage"
"github.com/gorilla/mux"
"github.com/joho/godotenv"
"net/http"
"os"
"strconv"
"time"
)
func main() {
err := godotenv.Load(".env")
if err != nil {
panic("Error loading .env file")
}
url := os.Getenv("TASK_APP_URL")
port := os.Getenv("PORT")
databasePath := os.Getenv("DATABASE")
timeout, err := strconv.Atoi(os.Getenv("TIMEOUT"))
if err != nil {
panic("Error converting timeout to int")
}
	database, err := storage.Init(databasePath)
	if err != nil {
		panic(err)
	}
	// ListenAndServe blocks, so close the database via defer rather than after the call
	defer storage.Close(database)
	cln := httpClient.NewClient(url, time.Duration(timeout))
	rep := repository.NewPersonRepository(database)
	router := mux.NewRouter()
	handlers.InitRoutes(router, rep, cln)
	err = http.ListenAndServe(":"+port, router)
	if err != nil {
		panic(err)
	}
}

View File

@ -0,0 +1,24 @@
package models
type Person struct {
Id int `json:"id"`
Name string `json:"name"`
Tasks []Task `json:"tasks"`
}
type PersonCreate struct {
Name string `json:"name"`
}
type Task struct {
Id int `json:"id"`
PersonId int `json:"person_id"`
Name string `json:"name"`
Date string `json:"date"`
}
type TaskCreate struct {
PersonId int `json:"person_id"`
Name string `json:"name"`
Date string `json:"date"`
}

View File

@ -0,0 +1,99 @@
package repository
import (
"PersonApp/models"
"database/sql"
)
type PersonRepository interface {
GetAllPersons() ([]models.Person, error)
GetPersonById(id int) (*models.Person, error)
CreatePerson(person models.Person) (*models.Person, error)
UpdatePerson(person models.Person) (*models.Person, error)
DeletePerson(id int) error
}
type personRepository struct {
DB *sql.DB
}
func NewPersonRepository(db *sql.DB) PersonRepository {
return &personRepository{DB: db}
}
func (pr *personRepository) GetAllPersons() ([]models.Person, error) {
rows, err := pr.DB.Query("select * from Persons")
if err != nil {
return nil, err
}
defer func(rows *sql.Rows) {
err := rows.Close()
if err != nil {
panic(err)
}
}(rows)
var persons []models.Person
for rows.Next() {
p := models.Person{}
err := rows.Scan(&p.Id, &p.Name)
if err != nil {
panic(err)
}
persons = append(persons, p)
}
return persons, err
}
func (pr *personRepository) GetPersonById(id int) (*models.Person, error) {
row := pr.DB.QueryRow("select * from Persons where id=?", id)
person := models.Person{}
err := row.Scan(&person.Id, &person.Name)
if err != nil {
return nil, err
}
return &person, err
}
func (pr *personRepository) CreatePerson(p models.Person) (*models.Person, error) {
res, err := pr.DB.Exec("INSERT INTO Persons (name) values (?)", p.Name)
if err != nil {
return nil, err
}
if res == nil {
return nil, nil
}
return &p, err
}
func (pr *personRepository) UpdatePerson(p models.Person) (*models.Person, error) {
res, err := pr.DB.Exec("UPDATE Persons SET name = ? WHERE id = ?", p.Name, p.Id)
if err != nil {
return nil, err
}
if res == nil {
return nil, nil
}
return &p, err
}
func (pr *personRepository) DeletePerson(id int) error {
_, err := pr.DB.Exec("DELETE FROM Persons WHERE id = ?", id)
if err != nil {
return err
}
return nil
}

View File

@ -0,0 +1,36 @@
package storage
import (
"database/sql"
_ "github.com/mattn/go-sqlite3"
)
func Init(databasePath string) (*sql.DB, error) {
db, err := sql.Open("sqlite3", databasePath)
if err != nil || db == nil {
return nil, err
}
if err := createTableIfNotExists(db); err != nil {
return nil, err
}
return db, nil
}
func Close(db *sql.DB) {
err := db.Close()
if err != nil {
return
}
}
func createTableIfNotExists(db *sql.DB) error {
if result, err := db.Exec(
"CREATE TABLE IF NOT EXISTS `Persons`(Id integer primary key autoincrement, Name text not null);",
); err != nil || result == nil {
return err
}
return nil
}

View File

@ -0,0 +1,25 @@
# Distributed Computing and Applications, Lab 3
## _Author: Bazunov Andrey Igrevich, PIbd-42_
GoLang was chosen as the main language. Each service has its own DOCKERFILE describing the conditions and steps for building that module.
# Docker
>Before running, install Docker and check the version
```sh
docker-compose --version
```
>Then configure the docker-compose.yaml file and start the containers, building the images
```sh
docker-compose up -d --build
```
>To shut the containers down, use
```sh
docker-compose down
```
[Demo video](https://vk.com/video/@viltskaa?z=video236673313_456239577%2Fpl_236673313_-2)

View File

@ -0,0 +1,4 @@
PORT=8000
PERSON_APP_URL=http://person-app:8080
TIMEOUT=15
DATABASE=./database.db

View File

@ -0,0 +1,14 @@
FROM golang:1.23
WORKDIR /app
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN go build -o /bin/TaskApp
EXPOSE 8000
CMD ["/bin/TaskApp"]

Binary file not shown.

View File

@ -0,0 +1,10 @@
module TaskApp
go 1.23.1
require (
github.com/gorilla/mux v1.8.1
github.com/mattn/go-sqlite3 v1.14.24
)
require github.com/joho/godotenv v1.5.1

View File

@ -0,0 +1,6 @@
github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/mattn/go-sqlite3 v1.14.24 h1:tpSp2G2KyMnnQu99ngJ47EIkWVmliIizyZBfPrBWDRM=
github.com/mattn/go-sqlite3 v1.14.24/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=


@ -0,0 +1,177 @@
package handlers
import (
"TaskApp/httpClient"
"TaskApp/models"
"TaskApp/repository"
"encoding/json"
"fmt"
"github.com/gorilla/mux"
"net/http"
"strconv"
)
func InitRoutes(r *mux.Router, rep repository.TaskRepository, cln httpClient.Client) {
r.HandleFunc("/", GetTasks(rep)).Methods("GET")
r.HandleFunc("/{id:[0-9]+}", GetTaskById(rep)).Methods("GET")
r.HandleFunc("/", CreateTask(rep, cln)).Methods("POST")
r.HandleFunc("/{id:[0-9]+}", UpdateTask(rep)).Methods("PUT")
r.HandleFunc("/{id:[0-9]+}", DeleteTask(rep)).Methods("DELETE")
r.HandleFunc("/f/{id:[0-9]+}", GetPersonTasks(rep)).Methods("GET")
}
func GetTasks(rep repository.TaskRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
tasks, err := rep.GetAllTasks()
if err != nil {
w.WriteHeader(http.StatusInternalServerError)
return
}
err = json.NewEncoder(w).Encode(tasks)
if err != nil {
w.WriteHeader(http.StatusInternalServerError)
}
}
}
func GetTaskById(rep repository.TaskRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
		id, err := strconv.Atoi(mux.Vars(r)["id"])
		if err != nil {
			// A malformed id is a client error, not a server error.
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		task, err := rep.GetTaskById(id)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		err = json.NewEncoder(w).Encode(task)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func GetPersonTasks(rep repository.TaskRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
tasks, err := rep.GetUserTasks(id)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
err = json.NewEncoder(w).Encode(tasks)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func CreateTask(rep repository.TaskRepository, cln httpClient.Client) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
		var task *models.TaskCreate
		err := json.NewDecoder(r.Body).Decode(&task)
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		// The original check compared field addresses to nil, which is
		// always false; validate the decoded values instead.
		if task == nil || task.Name == "" {
			http.Error(w, "name and person_id are required", http.StatusBadRequest)
			return
		}
person, err := cln.GetPerson(task.PersonId)
if err != nil {
fmt.Println(err)
			http.Error(w, "Failed to reach PersonApp.", http.StatusInternalServerError)
return
}
		if person == nil {
			// Use task.PersonId here: dereferencing a nil person would panic.
			http.Error(w, fmt.Sprintf("Person with id=%d was not found.", task.PersonId), http.StatusNotFound)
			return
		}
newTask, err := rep.CreateTask(*task)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.WriteHeader(http.StatusCreated)
err = json.NewEncoder(w).Encode(newTask)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
}
}
func UpdateTask(rep repository.TaskRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
		var task *models.TaskCreate
		if err := json.NewDecoder(r.Body).Decode(&task); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
newTask, err := rep.UpdateTask(models.Task{Id: id, Name: task.Name, Date: task.Date})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
err = json.NewEncoder(w).Encode(newTask)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
}
}
}
func DeleteTask(rep repository.TaskRepository) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
id, err := strconv.Atoi(mux.Vars(r)["id"])
if err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
err = rep.DeleteTask(id)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.WriteHeader(http.StatusOK)
}
}


@ -0,0 +1,73 @@
package httpClient
import (
"TaskApp/models"
"encoding/json"
"fmt"
"io"
"log"
"net/http"
"time"
)
type Client interface {
GetPerson(id int) (*models.Person, error)
TestConnection() (bool, error)
}
type client struct {
BaseUrl string
Timeout time.Duration
}
func (c *client) TestConnection() (bool, error) {
client := &http.Client{Timeout: c.Timeout}
url := fmt.Sprintf("%s/", c.BaseUrl)
resp, err := client.Get(url)
if err != nil {
return false, err
}
	defer func(Body io.ReadCloser) {
		if err := Body.Close(); err != nil {
			log.Printf("close response body: %s", err)
		}
	}(resp.Body)
if resp.StatusCode != http.StatusOK {
return false, fmt.Errorf("bad status code: %d", resp.StatusCode)
}
return true, nil
}
func (c *client) GetPerson(id int) (*models.Person, error) {
	// c.Timeout is already a scaled time.Duration; multiplying by
	// time.Second again (as before) would inflate it.
	client := &http.Client{Timeout: c.Timeout}
	url := fmt.Sprintf("%s/%d", c.BaseUrl, id)
	resp, err := client.Get(url)
	if err != nil {
		return nil, err
	}
	defer func(Body io.ReadCloser) {
		if err := Body.Close(); err != nil {
			log.Printf("close response body: %s", err)
		}
	}(resp.Body)
	if resp.StatusCode == http.StatusNotFound {
		// Let the caller distinguish "no such person" from transport errors.
		return nil, nil
	}
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return nil, err
	}
	var person models.Person
	if err := json.Unmarshal(body, &person); err != nil {
		log.Printf("Unmarshal error: %s", err)
		return nil, err
	}
	return &person, nil
}
}
func NewClient(baseUrl string, timeout time.Duration) Client {
return &client{BaseUrl: baseUrl, Timeout: timeout}
}


@ -0,0 +1,37 @@
GET http://localhost/task-app/
Accept: application/json
###
GET http://localhost/task-app/4
Accept: application/json
###
POST http://localhost/task-app/
Accept: application/json
Content-Type: application/json
{
"name": "TEST2",
"person_id": 1,
"date": "19.02.2202"
}
###
PUT http://localhost/task-app/4
Accept: application/json
Content-Type: application/json
{
"name": "TEST5",
"date": "19.02.2202"
}
###
DELETE http://localhost/task-app/4
Accept: application/json
###


@ -0,0 +1,47 @@
package main
import (
"TaskApp/handlers"
"TaskApp/httpClient"
"TaskApp/repository"
"TaskApp/storage"
"github.com/gorilla/mux"
"github.com/joho/godotenv"
"net/http"
"os"
"strconv"
"time"
)
func main() {
err := godotenv.Load(".env")
if err != nil {
panic("Error loading .env file")
}
url := os.Getenv("PERSON_APP_URL")
port := os.Getenv("PORT")
databasePath := os.Getenv("DATABASE")
timeout, err := strconv.Atoi(os.Getenv("TIMEOUT"))
if err != nil {
panic("Error converting timeout to int")
}
	database, err := storage.Init(databasePath)
	if err != nil {
		panic(err)
	}
	// Close the database on exit; the original Close call sat after
	// ListenAndServe and was unreachable.
	defer storage.Close(database)
	// TIMEOUT is given in seconds; a bare time.Duration(timeout) would be
	// interpreted as nanoseconds.
	cln := httpClient.NewClient(url, time.Duration(timeout)*time.Second)
	rep := repository.NewTaskRepository(database)
	router := mux.NewRouter()
	handlers.InitRoutes(router, rep, cln)
	if err := http.ListenAndServe(":"+port, router); err != nil {
		panic(err)
	}
}


@ -0,0 +1,24 @@
package models
type Person struct {
Id int `json:"id"`
Name string `json:"name"`
Tasks []Task `json:"tasks"`
}
type PersonCreate struct {
Name string `json:"name"`
}
type Task struct {
Id int `json:"id"`
PersonId int `json:"person_id"`
Name string `json:"name"`
Date string `json:"date"`
}
type TaskCreate struct {
PersonId int `json:"person_id"`
Name string `json:"name"`
Date string `json:"date"`
}


@ -0,0 +1,121 @@
package repository
import (
"TaskApp/models"
"database/sql"
)
type TaskRepository interface {
GetAllTasks() ([]models.Task, error)
GetTaskById(id int) (*models.Task, error)
GetUserTasks(id int) ([]models.Task, error)
CreateTask(task models.TaskCreate) (*models.Task, error)
UpdateTask(task models.Task) (*models.Task, error)
DeleteTask(id int) error
}
type taskRepository struct {
DB *sql.DB
}
func (t taskRepository) GetUserTasks(id int) ([]models.Task, error) {
	rows, err := t.DB.Query("select * from Tasks where PersonId = ?", id)
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	var tasks []models.Task
	for rows.Next() {
		p := models.Task{}
		// Column order follows the CREATE TABLE: Id, Name, PersonId, Date.
		if err := rows.Scan(&p.Id, &p.Name, &p.PersonId, &p.Date); err != nil {
			// Return the error instead of panicking inside a request.
			return nil, err
		}
		tasks = append(tasks, p)
	}
	return tasks, rows.Err()
}
func (t taskRepository) GetAllTasks() ([]models.Task, error) {
	rows, err := t.DB.Query("select * from Tasks")
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	var tasks []models.Task
	for rows.Next() {
		p := models.Task{}
		if err := rows.Scan(&p.Id, &p.Name, &p.PersonId, &p.Date); err != nil {
			// Return the error instead of panicking inside a request.
			return nil, err
		}
		tasks = append(tasks, p)
	}
	return tasks, rows.Err()
}
func (t taskRepository) GetTaskById(id int) (*models.Task, error) {
row := t.DB.QueryRow("select * from Tasks where id=?", id)
task := models.Task{}
err := row.Scan(&task.Id, &task.Name, &task.PersonId, &task.Date)
if err != nil {
return nil, err
}
return &task, err
}
func (t taskRepository) CreateTask(task models.TaskCreate) (*models.Task, error) {
	result, err := t.DB.Exec("INSERT INTO Tasks(Name, PersonId, Date) VALUES (?, ?, ?)", task.Name, task.PersonId, task.Date)
	if err != nil {
		return nil, err
	}
	// Report the real autoincrement id rather than a hard-coded 0.
	id, err := result.LastInsertId()
	if err != nil {
		return nil, err
	}
	return &models.Task{
		Id:       int(id),
		PersonId: task.PersonId,
		Name:     task.Name,
		Date:     task.Date,
	}, nil
}
func (t taskRepository) UpdateTask(task models.Task) (*models.Task, error) {
_, err := t.DB.Exec("UPDATE Tasks SET name = ?, date = ? WHERE id = ?", task.Name, task.Date, task.Id)
if err != nil {
return nil, err
}
return &task, err
}
func (t taskRepository) DeleteTask(id int) error {
_, err := t.DB.Exec("DELETE FROM Tasks WHERE id = ?", id)
if err != nil {
return err
}
return nil
}
func NewTaskRepository(db *sql.DB) TaskRepository {
return &taskRepository{DB: db}
}


@ -0,0 +1,36 @@
package storage
import (
"database/sql"
_ "github.com/mattn/go-sqlite3"
)
func Init(databasePath string) (*sql.DB, error) {
db, err := sql.Open("sqlite3", databasePath)
if err != nil || db == nil {
return nil, err
}
if err := createTableIfNotExists(db); err != nil {
return nil, err
}
return db, nil
}
func Close(db *sql.DB) {
	// Best-effort close; an error here is not actionable at shutdown.
	_ = db.Close()
}
func createTableIfNotExists(db *sql.DB) error {
	// Exec never returns a nil result with a nil error, so checking
	// the error alone is sufficient.
	_, err := db.Exec(
		"CREATE TABLE IF NOT EXISTS `Tasks`(Id integer primary key autoincrement, Name text not null, PersonId integer not null, Date text not null);",
	)
	return err
}


@ -0,0 +1,34 @@
services:
person-app:
build:
context: ./PersonApp
dockerfile: Dockerfile
networks:
- network
ports:
- "8080:8080"
task-app:
build:
context: ./TaskApp
dockerfile: Dockerfile
networks:
- network
ports:
- "8000:8000"
nginx:
image: nginx
ports:
- "80:80"
volumes:
- ./nginx.conf:/etc/nginx/nginx.conf
networks:
- network
depends_on:
- person-app
- task-app
networks:
network:
driver: bridge


@ -0,0 +1,59 @@
events {
worker_connections 1024;
}
http {
server {
listen 80;
server_name localhost;
location /person-app/ {
proxy_pass http://person-app:8080/;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
add_header 'Access-Control-Allow-Origin' '*';
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
add_header 'Access-Control-Allow-Headers' 'Origin, Content-Type, Accept, Authorization';
}
location /task-app/ {
proxy_pass http://task-app:8000/;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
add_header 'Access-Control-Allow-Origin' '*';
add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
add_header 'Access-Control-Allow-Headers' 'Origin, Content-Type, Accept, Authorization';
}
        # Proxy for Swagger (Stream service)
#location /stream-service/swagger/ {
# proxy_pass http://stream-service:8000/swagger/;
# proxy_set_header Host $host;
# proxy_set_header X-Real-IP $remote_addr;
# proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
# proxy_set_header X-Forwarded-Proto $scheme;
#}
        # Proxy for Swagger (Message service)
#location /message-service/swagger/ {
# proxy_pass http://message-service:8080/swagger/;
# proxy_set_header Host $host;
# proxy_set_header X-Real-IP $remote_addr;
# proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
# proxy_set_header X-Forwarded-Proto $scheme;
#}
#location /stream-service/doc.json {
# proxy_pass http://stream-service:8000/doc.json;
#}
#location /message-service/doc.json {
# proxy_pass http://message-service:8080/doc.json;
#}
}
}
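The trailing slash on `proxy_pass` makes nginx strip the matched location prefix before forwarding, so a public request to `/task-app/4` reaches the service as `/4`, matching its `/{id}` routes. A standalone sketch of that mapping (illustrative helper only, not nginx internals):

```go
package main

import (
	"fmt"
	"strings"
)

// mapToUpstream mimics nginx's prefix stripping for a location block
// whose proxy_pass ends in a slash (hypothetical helper for illustration).
func mapToUpstream(path, prefix string) string {
	return "/" + strings.TrimPrefix(strings.TrimPrefix(path, prefix), "/")
}

func main() {
	fmt.Println(mapToUpstream("/task-app/4", "/task-app/"))
}
```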


@ -0,0 +1,34 @@
# Lab 4: Working with a Message Broker (RabbitMQ)
## Goal
Learn to design applications that use the RabbitMQ message broker.
---
## Tasks
> 1. **Install RabbitMQ**
Install RabbitMQ on your local machine (or run it in Docker).
>- [Download RabbitMQ](https://www.rabbitmq.com/download.html)
>- [RabbitMQ releases](https://github.com/rabbitmq/rabbitmq-server/releases/)
>- **Complete the RabbitMQ tutorials**
>- Take screenshots showing the `producer` and `consumer` running and messages being delivered.
---
## First tutorial
> ![img.png](static/img1.png)
---
## Second tutorial
>![img.png](static/img2.png)
>![img_1.png](static/img3.png)
---
## Third tutorial
> ![img.png](static/img4.png)
---
## Task
>![img.png](static/img5.png)
> ![img.png](static/img.png)


@ -0,0 +1,17 @@
version: "3.2"
services:
rabbitmq:
image: rabbitmq:3-management-alpine
container_name: 'rabbitmq'
ports:
- "5672:5672"
- "15672:15672"
volumes:
- ~/.docker-conf/rabbitmq/data/:/var/lib/rabbitmq/
- ~/.docker-conf/rabbitmq/log/:/var/log/rabbitmq
networks:
- rabbitmq_go_net
networks:
rabbitmq_go_net:
driver: bridge


@ -0,0 +1,47 @@
from datetime import datetime
import random
import threading
import pika
import sys
_alphabet = [chr(i) for i in range(97, 123)]
def run_every_n_seconds(seconds, action, *args):
threading.Timer(seconds, run_every_n_seconds, [seconds, action] + list(args)).start()
action(*args)
def generate_message():
now = datetime.now()
current_time = now.strftime("%H:%M:%S")
return f"[{current_time}] " + "".join(random.choices(_alphabet, k=random.randint(1, 10)))
def send_message(channel_local):
message = generate_message()
channel_local.basic_publish(
exchange='vk_messages',
routing_key='vk_messages',
body=message,
properties=pika.BasicProperties(
delivery_mode=pika.DeliveryMode.Persistent
))
print(f"[vkAuthor] Sent {message}")
def main(conn: pika.BlockingConnection):
channel = conn.channel()
channel.exchange_declare(exchange='vk_messages', exchange_type='fanout')
run_every_n_seconds(1, send_message, channel)
if __name__ == '__main__':
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
try:
main(connection)
except KeyboardInterrupt:
connection.close()
sys.exit(0)


@ -0,0 +1,44 @@
import sys
from datetime import datetime
import pika
_QUEUE_NAME = "vk_messages_queue"
_EXCHANGE_NAME = "vk_messages"
def main():
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(
exchange=_EXCHANGE_NAME,
exchange_type='fanout'
)
channel.queue_declare(queue=_QUEUE_NAME, exclusive=True)
channel.queue_bind(exchange=_EXCHANGE_NAME, queue=_QUEUE_NAME)
def callback(ch, method, properties, body):
now = datetime.now()
current_time = now.strftime("%H:%M:%S")
print(f"[vkReader] Received [{body.decode()}] in [{current_time}]")
ch.basic_ack(delivery_tag=method.delivery_tag)
channel.basic_consume(
queue=_QUEUE_NAME,
on_message_callback=callback,
auto_ack=False
)
print('[*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
print('Interrupted')
sys.exit(0)


@ -0,0 +1,47 @@
import time
import random
from datetime import datetime
import pika
import sys
_QUEUE_NAME = "vk_messages_queue_slow"
_EXCHANGE_NAME = "vk_messages"
def main():
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(
exchange=_EXCHANGE_NAME,
exchange_type='fanout'
)
channel.queue_declare(queue=_QUEUE_NAME, exclusive=True)
channel.queue_bind(exchange=_EXCHANGE_NAME, queue=_QUEUE_NAME)
def callback(ch, method, properties, body):
now = datetime.now()
current_time = now.strftime("%H:%M:%S")
print(f"[vkSlowReader] Received [{body.decode()}] in [{current_time}]")
read_time = random.randint(2, 5)
time.sleep(read_time)
ch.basic_ack(delivery_tag=method.delivery_tag)
channel.basic_consume(
queue=_QUEUE_NAME,
on_message_callback=callback,
auto_ack=False
)
print('[*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
print('Interrupted')
sys.exit(0)


@ -0,0 +1,25 @@
import pika
import sys
def main():
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
def callback(ch, method, properties, body):
print(f" [x] Received {body}")
channel.basic_consume(queue='hello', on_message_callback=callback, auto_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
print('Interrupted')
sys.exit(0)


@ -0,0 +1,11 @@
import pika
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
channel.basic_publish(exchange='', routing_key='hello', body='Hello World!')
print(" [x] Sent 'Hello World!'")
connection.close()


@ -0,0 +1,19 @@
import pika
import sys
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
message = ' '.join(sys.argv[1:]) or "Hello World!"
channel.basic_publish(
exchange='',
routing_key='task_queue',
body=message,
properties=pika.BasicProperties(
delivery_mode=pika.DeliveryMode.Persistent
))
print(f" [x] Sent {message}")
connection.close()


@ -0,0 +1,22 @@
import pika
import time
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
def callback(ch, method, properties, body):
print(f" [x] Received {body.decode()}")
time.sleep(body.count(b'.'))
print(" [x] Done")
ch.basic_ack(delivery_tag=method.delivery_tag)
channel.basic_qos(prefetch_count=1)
channel.basic_consume(queue='task_queue', on_message_callback=callback)
channel.start_consuming()



@ -0,0 +1,13 @@
import pika
import sys
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')
message = ' '.join(sys.argv[1:]) or "info: Hello World!"
channel.basic_publish(exchange='logs', routing_key='', body=message)
print(f" [x] Sent {message}")
connection.close()


@ -0,0 +1,24 @@
import pika
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')
result = channel.queue_declare(queue='', exclusive=True)
queue_name = result.method.queue
channel.queue_bind(exchange='logs', queue=queue_name)
print(' [*] Waiting for logs. To exit press CTRL+C')
def callback(ch, method, properties, body):
print(f" [x] {body}")
channel.basic_consume(
queue=queue_name, on_message_callback=callback, auto_ack=True)
channel.start_consuming()


@ -0,0 +1,35 @@
# Lab 1
## Bogdanov Dmitry, PIbd-42
### The following services were deployed:
* PostgreSQL - database
* Mediawiki - wiki engine
* Gitea - git hosting engine
### Using the following technologies:
* git
* docker
* docker-compose
### Running the lab:
Go to the folder containing docker-compose.yaml and run:
```
docker-compose up -d
```
## Startup output:
```
[+] Running 4/4
✔ Network bogdanov_dmitry_lab_1_default Created 0.0s
✔ Container bogdanov_dmitry_lab_1-mediawiki-1 Started 0.7s
✔ Container bogdanov_dmitry_lab_1-git-1 Started 0.8s
✔ Container bogdanov_dmitry_lab_1-db-1 Started 0.7s
```
## Demo video:
The video is available at this [link](https://drive.google.com/file/d/1TES58HIeCnnKbtwWgED2oig4N7plBmol/view).
