Compare commits

..

2 Commits

Author SHA1 Message Date
a18c5236ce aleikin_artem_lab_8 is ready 2024-11-20 17:05:26 +04:00
50c8956afd aleikin_artem_lab_8 2024-11-20 17:05:17 +04:00
452 changed files with 51 additions and 16113 deletions

15
.gitignore vendored Normal file
View File

@ -0,0 +1,15 @@
################################################################################
# This .gitignore file was automatically generated by Microsoft(R) Visual Studio.
################################################################################
/.vs
/aleikin_artem_lab_3
/aleikin_artem_lab_4
/aleikin_artem_lab_5/MultiplyMatrix
/aleikin_artem_lab_6/DerminantMatrix
/dozorova_alena_lab_2
/dozorova_alena_lab_3
/dozorova_alena_lab_3/PostService/obj/Debug/net6.0/.NETCoreApp,Version=v6.0.AssemblyAttributes.cs
/dozorova_alena_lab_4
/dozorova_alena_lab_5/ConsoleApp1/obj
/dozorova_alena_lab_6/ConsoleApp1/obj

View File

@ -1,44 +0,0 @@
# Lab 1
## Description
This lab sets up three services — **Gitea**, **Redmine**, and a **MySQL** database — using Docker Compose. **Gitea** is a lightweight self-hosted Git service with a web interface, **Redmine** is a project and task management system that also serves as a bug tracker, and **MySQL** is the database Redmine uses to store its data.
## Running the project
1. Make sure **Docker** and **Docker Compose** are installed.
2. Clone the repository with this project, or create a `docker-compose.yml` file with the configuration provided there.
3. In a terminal, change to the directory containing `docker-compose.yml`.
4. Run the command:
```bash
docker-compose up -d
```
This command starts the containers in detached (background) mode.
5. Once the containers are up (a quick availability check is sketched below):
- Gitea will be available at: [http://localhost:3000](http://localhost:3000)
- Redmine will be available at: [http://localhost:8080](http://localhost:8080)
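A rough way to confirm both web UIs are reachable (a minimal sketch; the ports assume the defaults from this `docker-compose.yml`):
```bash
# Show the running containers and their port mappings
docker-compose ps

# Expect an HTTP response from each web UI
curl -I http://localhost:3000   # Gitea
curl -I http://localhost:8080   # Redmine
```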
## Configuration
The following services are defined in `docker-compose.yml`:
- **Gitea**:
- Runs from the official `gitea/gitea:latest` image.
- Uses `SQLite` for data storage.
- Exposed on port 3000 for web access and 2222 for SSH.
- **Redmine**:
- Runs from the official `redmine` image.
- Connects to the MySQL database.
- Available on port 8080.
- **MySQL**:
- Runs from the `mysql:8.0` image.
- Used by Redmine for data storage.
- Configured with a default user, database, and password (a quick database check is sketched below).
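Once Redmine has finished initializing, you can confirm its schema exists by opening a MySQL shell inside the `db` container (a sketch assuming the `user`/`password` credentials and the `redmine` database from this compose file):
```bash
# Run the MySQL client inside the db service and list Redmine's tables
docker-compose exec db mysql -u user -ppassword redmine -e "SHOW TABLES;"
```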
## Stopping the project
To stop the containers, run:
```bash
docker-compose down
```
This shuts down all containers and frees the ports.
## Notes
- If needed, you can change the ports or other parameters by editing `docker-compose.yml`.
- Data is stored in the named volumes `gitea_data` and `db_data`, so it survives container restarts (see the sketch below).
- Demo video: https://vk.com/video215756667_456239451?list=ln-AMZSRDejYptijuOt9u
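A minimal way to inspect those volumes with the standard Docker CLI (a sketch; Compose usually prefixes volume names with the project name, so the exact names may differ):
```bash
# List the named volumes created by this project
docker volume ls | grep -E 'gitea_data|db_data'

# Show where a volume's data lives on the host (hypothetical prefixed name)
docker volume inspect <project>_gitea_data
```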

View File

@ -1,46 +0,0 @@
version: '3.9' # Docker Compose file format version
services:
# Gitea service
gitea:
image: gitea/gitea:latest # Gitea image used to run the service
container_name: gitea # Container name, for convenience
environment: # Environment variables
USER_UID: 1000 # UID of the user inside the container
USER_GID: 1000 # GID of the user inside the container
GITEA__database__DB_TYPE: sqlite3 # Database type (SQLite for simplicity)
GITEA__database__PATH: /data/gitea/gitea.db # Path to the database file
GITEA__server__ROOT_URL: http://localhost:3000 # URL used to access Gitea
GITEA__server__HTTP_PORT: 3000 # Web UI port
volumes:
- gitea_data:/data # Mount the data directory to persist data
ports:
- "3000:3000" # Port for the Gitea web UI
- "2222:22" # SSH port for cloning repositories
restart: always # Restart the container automatically on failure
# Redmine service
redmine:
image: redmine # Redmine image used to run the service
restart: always # Restart the container automatically
ports:
- 8080:3000 # Port for the web UI
environment: # Environment variables
REDMINE_DB_MYSQL: db # Database host name to connect to
REDMINE_DB_PASSWORD: example # Database password
# MySQL database for Redmine
db:
image: mysql:8.0 # MySQL image for the database
restart: always # Restart the container automatically
environment: # Environment variables
MYSQL_ROOT_PASSWORD: example # MySQL root password
MYSQL_DATABASE: redmine # Database name for Redmine
MYSQL_USER: user # MySQL user
MYSQL_PASSWORD: password # Password for the MySQL user
volumes:
- db_data:/var/lib/mysql # Mount to persist database data
volumes: # Named volumes
gitea_data: # Gitea data volume
db_data: # MySQL data volume

View File

@ -1,92 +0,0 @@
data/
##############################
## Java
##############################
.mtj.tmp/
*.class
*.jar
*.war
*.ear
*.nar
hs_err_pid*
replay_pid*
##############################
## Maven
##############################
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
pom.xml.bak
release.properties
dependency-reduced-pom.xml
buildNumber.properties
.mvn/timing.properties
.mvn/wrapper/maven-wrapper.jar
##############################
## Gradle
##############################
bin/
build/
.gradle
.gradletasknamecache
gradle-app.setting
!gradle-wrapper.jar
##############################
## IntelliJ
##############################
out/
.idea/
.idea_modules/
*.iml
*.ipr
*.iws
##############################
## Eclipse
##############################
.settings/
bin/
tmp/
.metadata
.classpath
.project
*.tmp
*.bak
*.swp
*~.nib
local.properties
.loadpath
.factorypath
##############################
## NetBeans
##############################
nbproject/private/
build/
nbbuild/
dist/
nbdist/
nbactions.xml
nb-configuration.xml
##############################
## Visual Studio Code
##############################
.vscode/
.code-workspace
##############################
## OS X
##############################
.DS_Store
##############################
## Miscellaneous
##############################
*.log

View File

@ -1,38 +0,0 @@
# Lab 2
## Description
This lab sets up two services (a minimal distributed application) using Docker Compose. **FirstService** finds the file with the largest number of lines in the /var/data directory and copies it to /var/result/data.txt. **SecondService** reads the smallest number from /var/result/data.txt (produced by the first service) and writes its cube to /var/result/result.txt.
## Running the project
1. Make sure **Docker** and **Docker Compose** are installed.
2. Clone the repository with this project.
3. In a terminal, change to the directory containing `docker-compose.yml`.
4. Run the command:
```bash
docker-compose up -d
```
This command starts the containers in detached (background) mode.
5. After startup (the log commands are sketched below):
- Check the first service's logs to confirm the file was created successfully.
- Check the second service's logs to confirm it processed the file created by the first service.
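A minimal way to read those logs (a sketch; the service names `first-service` and `second-service` come from this lab's `docker-compose.yml`):
```bash
# Logs of the service that picks the file with the most lines
docker-compose logs first-service

# Logs of the service that cubes the smallest number
docker-compose logs second-service
```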
## Configuration
The following services are defined in `docker-compose.yml`:
- **FirstService**:
- Builds its image from the `firstService` directory.
- Uses a local directory mounted at `/var/data` and the shared `/var/result` volume for its data.
- **SecondService**:
- Builds its image from the `secondService` directory.
- Uses the shared `/var/result` volume for its data.
- Starts after the first service (a sample input setup is sketched below).
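To try the pipeline end to end, the host directory mapped to `/var/data` needs a few input files with different line counts (a sketch; `data/` stands in for whatever host path is mounted in `docker-compose.yml`):
```bash
# Create sample input files with different numbers of lines
printf '5\n' > data/a.txt
printf '10\n7\n42\n' > data/b.txt   # most lines, so it becomes /var/result/data.txt

# Rebuild and run; the second service should write 7^3 = 343 to /var/result/result.txt
docker-compose up --build
```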
## Stopping the project
To stop the containers, run:
```bash
docker-compose down
```
This shuts down all containers.
## Notes
- If needed, you can change the data directory or other parameters by editing `docker-compose.yml`.
- Demo video: https://vk.com/video215756667_456239452?list=ln-rAyQWJj8q7ezqCaZzL

View File

@ -1,17 +0,0 @@
version: '3.9'
services:
first-service:
build: ./firstService # Path to the first application's Dockerfile
volumes:
- D:/java/DAS_2024_1/afanasev_dmitry_lab_2/data:/var/data # Mount the directory with the input data
- common-volume:/var/result # Mount the shared directory (the second service needs it)
second-service:
build: ./secondService # Path to the second application's Dockerfile
volumes:
- common-volume:/var/result # Mount the shared directory (the second service needs it)
depends_on:
- first-service # Start after the first service
volumes: # Named volumes
common-volume: # Shared between the two services

View File

@ -1,17 +0,0 @@
# Use a Java 17 base image
FROM bellsoft/liberica-openjdk-alpine:17.0.8
# Create the directory for the input files
RUN mkdir /var/data
# Create the application directory
WORKDIR /app
# Copy the application sources into the container
COPY src /app/src
# Compile the application
RUN javac /app/src/FirstService.java
# Define the command that runs the application
CMD ["java", "-cp", "/app/src", "FirstService"]

View File

@ -1,52 +0,0 @@
import java.io.*;
import java.nio.file.*;
public class FirstService {
// 1. Finds the file with the largest number of lines in /var/data and copies it to /var/result/data.txt.
public static void main(String[] args) {
Path sourceDir = Paths.get("/var/data");
Path destinationDir = Paths.get("/var/result");
Path destinationFile = destinationDir.resolve("data.txt");
Path largestFile = null;
long maxLineCount = 0;
try {
// create the /var/result directory if it does not exist
if (!Files.exists(destinationDir)) {
Files.createDirectories(destinationDir);
} else {
// otherwise clear it
try (DirectoryStream<Path> stream = Files.newDirectoryStream(destinationDir)) {
for (Path file : stream) {
Files.delete(file);
}
}
}
// find the file with the largest number of lines in /var/data
try (DirectoryStream<Path> stream = Files.newDirectoryStream(sourceDir)) {
for (Path file : stream) {
if (Files.isRegularFile(file)) {
long lineCount = Files.lines(file).count();
if (lineCount > maxLineCount) {
maxLineCount = lineCount;
largestFile = file;
}
}
}
}
// copy the file with the most lines to /var/result/data.txt
if (largestFile != null) {
Files.copy(largestFile, destinationFile, StandardCopyOption.REPLACE_EXISTING);
System.out.println("File " + largestFile + " copied to " + destinationFile);
} else {
System.out.println("No files found in " + sourceDir + ".");
}
} catch (IOException e) {
e.printStackTrace();
}
}
}

View File

@ -1,17 +0,0 @@
# Use a Java 17 base image
FROM bellsoft/liberica-openjdk-alpine:17.0.8
# Create the directory for the input files
RUN mkdir /var/data
# Create the application directory
WORKDIR /app
# Copy the application sources into the container
COPY src /app/src
# Compile the application
RUN javac /app/src/SecondService.java
# Define the command that runs the application
CMD ["java", "-cp", "/app/src", "SecondService"]

View File

@ -1,51 +0,0 @@
import java.io.*;
import java.nio.file.*;
import java.util.*;
public class SecondService {
// 2. Reads the smallest number from /var/result/data.txt and writes its cube to /var/result/result.txt.
public static void main(String[] args) {
Path sourceFile = Paths.get("/var/result/data.txt");
Path destinationDir = Paths.get("/var/result");
Path destinationFile = destinationDir.resolve("result.txt");
try {
// create /var/result if it does not exist
if (!Files.exists(destinationDir)) {
Files.createDirectories(destinationDir);
}
// read the numbers from the file and find the smallest one
List<Integer> numbers = new ArrayList<>();
try (BufferedReader reader = Files.newBufferedReader(sourceFile)) {
String line;
while ((line = reader.readLine()) != null) {
try {
numbers.add(Integer.parseInt(line.trim()));
} catch (NumberFormatException e) {
System.out.println("Некорректная строка: " + line);
}
}
}
if (!numbers.isEmpty()) {
// find the smallest number and its cube
int minNumber = Collections.min(numbers);
int minNumberCubed = (int) Math.pow(minNumber, 3);
// write the result to /var/result/result.txt
try (BufferedWriter writer = Files.newBufferedWriter(destinationFile)) {
writer.write(String.valueOf(minNumberCubed));
System.out.println("The cube of the smallest number " + minNumber + " (" + minNumberCubed +
") was saved to " + destinationFile);
}
} else {
System.out.println("File " + sourceFile + " is empty or contains no numbers.");
}
} catch (IOException e) {
e.printStackTrace();
}
}
}

View File

@ -1,41 +0,0 @@
# Lab 3
## Description
This lab sets up three services — **Melon**, **Water**, and the **Nginx** proxy server — using Docker Compose. **Melon** is a service that manages melons, each linked to its own watermelon; **Water** is a service that manages water entities and the watermelons associated with them. **Nginx** is a proxy server that runs on Unix-like operating systems.
## Running the project
1. Make sure **Docker** and **Docker Compose** are installed.
2. Clone the repository with this project.
3. In a terminal, change to the directory containing `docker-compose.yml`.
4. Run the command:
```bash
docker-compose up -d
```
This command starts the containers in detached (background) mode.
5. After startup (sample requests are sketched below):
- Melon will be available at: [http://localhost:8080](http://localhost:8080)
- Water will be available at: [http://localhost:8081](http://localhost:8081)
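A quick way to poke at the services once they are up (a sketch; the `/melon` path comes from `MelonController`, while the `/water` path on the Water side is an assumption based on the analogous controller):
```bash
# List melons directly from the Melon service
curl http://localhost:8080/melon

# List waters directly from the Water service (assumed endpoint)
curl http://localhost:8081/water
```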
## Configuration
The following services are defined in `docker-compose.yml`:
- **Melon**:
- Exposed on port 8080 for web access.
- Calls **Nginx** to reach the **Water** service.
- Implements basic CRUD operations.
- **Water**:
- Exposed on port 8081 for web access.
- Implements basic CRUD operations.
- **Nginx**:
- Runs from the `nginx` image.
- Used to proxy requests (see the routing sketch below).
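Per `nginx.conf`, requests to the proxy on port 80 are routed by path prefix, so the same endpoints are reachable through Nginx (a sketch; the Water path is again an assumption):
```bash
# /melon/... is proxied to the Melon service on port 8080
curl http://localhost/melon/melon

# /water/... is proxied to the Water service on port 8081 (assumed endpoint)
curl http://localhost/water/water
```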
## Stopping the project
To stop the containers, run:
```bash
docker-compose down
```
This shuts down all containers and frees the ports.
## Notes
- If needed, you can change the ports or other parameters by editing `docker-compose.yml`.
- Demo video: https://vk.com/video215756667_456239453?list=ln-6zVfNOSwMQtpVWKkGe

View File

@ -1,25 +0,0 @@
version: '3.9'
services:
melon:
build: ./melon
ports:
- "8080:8080"
expose: # The port exposed inside the container network
- 8080
water:
build: ./water
ports:
- "8081:8081"
expose: # The port exposed inside the container network
- 8081
nginx:
image: nginx
ports:
- "80:80"
volumes:
- ./nginx.conf:/etc/nginx/nginx.conf
depends_on:
- melon
- water

View File

@ -1,2 +0,0 @@
/mvnw text eol=lf
*.cmd text eol=crlf

View File

@ -1,33 +0,0 @@
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**/target/
!**/src/test/**/target/
### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache
### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr
### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/
!**/src/main/**/build/
!**/src/test/**/build/
### VS Code ###
.vscode/

View File

@ -1,19 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
wrapperVersion=3.3.2
distributionType=only-script
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.9/apache-maven-3.9.9-bin.zip

View File

@ -1,4 +0,0 @@
FROM bellsoft/liberica-openjdk-alpine:17.0.8
ADD target/melon-0.0.1-SNAPSHOT.jar /app/
CMD ["java", "-Xmx200m", "-jar", "/app/melon-0.0.1-SNAPSHOT.jar"]
WORKDIR /app

View File

@ -1,259 +0,0 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Apache Maven Wrapper startup batch script, version 3.3.2
#
# Optional ENV vars
# -----------------
# JAVA_HOME - location of a JDK home dir, required when download maven via java source
# MVNW_REPOURL - repo url base for downloading maven distribution
# MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
# MVNW_VERBOSE - true: enable verbose log; debug: trace the mvnw script; others: silence the output
# ----------------------------------------------------------------------------
set -euf
[ "${MVNW_VERBOSE-}" != debug ] || set -x
# OS specific support.
native_path() { printf %s\\n "$1"; }
case "$(uname)" in
CYGWIN* | MINGW*)
[ -z "${JAVA_HOME-}" ] || JAVA_HOME="$(cygpath --unix "$JAVA_HOME")"
native_path() { cygpath --path --windows "$1"; }
;;
esac
# set JAVACMD and JAVACCMD
set_java_home() {
# For Cygwin and MinGW, ensure paths are in Unix format before anything is touched
if [ -n "${JAVA_HOME-}" ]; then
if [ -x "$JAVA_HOME/jre/sh/java" ]; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
JAVACCMD="$JAVA_HOME/jre/sh/javac"
else
JAVACMD="$JAVA_HOME/bin/java"
JAVACCMD="$JAVA_HOME/bin/javac"
if [ ! -x "$JAVACMD" ] || [ ! -x "$JAVACCMD" ]; then
echo "The JAVA_HOME environment variable is not defined correctly, so mvnw cannot run." >&2
echo "JAVA_HOME is set to \"$JAVA_HOME\", but \"\$JAVA_HOME/bin/java\" or \"\$JAVA_HOME/bin/javac\" does not exist." >&2
return 1
fi
fi
else
JAVACMD="$(
'set' +e
'unset' -f command 2>/dev/null
'command' -v java
)" || :
JAVACCMD="$(
'set' +e
'unset' -f command 2>/dev/null
'command' -v javac
)" || :
if [ ! -x "${JAVACMD-}" ] || [ ! -x "${JAVACCMD-}" ]; then
echo "The java/javac command does not exist in PATH nor is JAVA_HOME set, so mvnw cannot run." >&2
return 1
fi
fi
}
# hash string like Java String::hashCode
hash_string() {
str="${1:-}" h=0
while [ -n "$str" ]; do
char="${str%"${str#?}"}"
h=$(((h * 31 + $(LC_CTYPE=C printf %d "'$char")) % 4294967296))
str="${str#?}"
done
printf %x\\n $h
}
verbose() { :; }
[ "${MVNW_VERBOSE-}" != true ] || verbose() { printf %s\\n "${1-}"; }
die() {
printf %s\\n "$1" >&2
exit 1
}
trim() {
# MWRAPPER-139:
# Trims trailing and leading whitespace, carriage returns, tabs, and linefeeds.
# Needed for removing poorly interpreted newline sequences when running in more
# exotic environments such as mingw bash on Windows.
printf "%s" "${1}" | tr -d '[:space:]'
}
# parse distributionUrl and optional distributionSha256Sum, requires .mvn/wrapper/maven-wrapper.properties
while IFS="=" read -r key value; do
case "${key-}" in
distributionUrl) distributionUrl=$(trim "${value-}") ;;
distributionSha256Sum) distributionSha256Sum=$(trim "${value-}") ;;
esac
done <"${0%/*}/.mvn/wrapper/maven-wrapper.properties"
[ -n "${distributionUrl-}" ] || die "cannot read distributionUrl property in ${0%/*}/.mvn/wrapper/maven-wrapper.properties"
case "${distributionUrl##*/}" in
maven-mvnd-*bin.*)
MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/
case "${PROCESSOR_ARCHITECTURE-}${PROCESSOR_ARCHITEW6432-}:$(uname -a)" in
*AMD64:CYGWIN* | *AMD64:MINGW*) distributionPlatform=windows-amd64 ;;
:Darwin*x86_64) distributionPlatform=darwin-amd64 ;;
:Darwin*arm64) distributionPlatform=darwin-aarch64 ;;
:Linux*x86_64*) distributionPlatform=linux-amd64 ;;
*)
echo "Cannot detect native platform for mvnd on $(uname)-$(uname -m), use pure java version" >&2
distributionPlatform=linux-amd64
;;
esac
distributionUrl="${distributionUrl%-bin.*}-$distributionPlatform.zip"
;;
maven-mvnd-*) MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/ ;;
*) MVN_CMD="mvn${0##*/mvnw}" _MVNW_REPO_PATTERN=/org/apache/maven/ ;;
esac
# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
[ -z "${MVNW_REPOURL-}" ] || distributionUrl="$MVNW_REPOURL$_MVNW_REPO_PATTERN${distributionUrl#*"$_MVNW_REPO_PATTERN"}"
distributionUrlName="${distributionUrl##*/}"
distributionUrlNameMain="${distributionUrlName%.*}"
distributionUrlNameMain="${distributionUrlNameMain%-bin}"
MAVEN_USER_HOME="${MAVEN_USER_HOME:-${HOME}/.m2}"
MAVEN_HOME="${MAVEN_USER_HOME}/wrapper/dists/${distributionUrlNameMain-}/$(hash_string "$distributionUrl")"
exec_maven() {
unset MVNW_VERBOSE MVNW_USERNAME MVNW_PASSWORD MVNW_REPOURL || :
exec "$MAVEN_HOME/bin/$MVN_CMD" "$@" || die "cannot exec $MAVEN_HOME/bin/$MVN_CMD"
}
if [ -d "$MAVEN_HOME" ]; then
verbose "found existing MAVEN_HOME at $MAVEN_HOME"
exec_maven "$@"
fi
case "${distributionUrl-}" in
*?-bin.zip | *?maven-mvnd-?*-?*.zip) ;;
*) die "distributionUrl is not valid, must match *-bin.zip or maven-mvnd-*.zip, but found '${distributionUrl-}'" ;;
esac
# prepare tmp dir
if TMP_DOWNLOAD_DIR="$(mktemp -d)" && [ -d "$TMP_DOWNLOAD_DIR" ]; then
clean() { rm -rf -- "$TMP_DOWNLOAD_DIR"; }
trap clean HUP INT TERM EXIT
else
die "cannot create temp dir"
fi
mkdir -p -- "${MAVEN_HOME%/*}"
# Download and Install Apache Maven
verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
verbose "Downloading from: $distributionUrl"
verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"
# select .zip or .tar.gz
if ! command -v unzip >/dev/null; then
distributionUrl="${distributionUrl%.zip}.tar.gz"
distributionUrlName="${distributionUrl##*/}"
fi
# verbose opt
__MVNW_QUIET_WGET=--quiet __MVNW_QUIET_CURL=--silent __MVNW_QUIET_UNZIP=-q __MVNW_QUIET_TAR=''
[ "${MVNW_VERBOSE-}" != true ] || __MVNW_QUIET_WGET='' __MVNW_QUIET_CURL='' __MVNW_QUIET_UNZIP='' __MVNW_QUIET_TAR=v
# normalize http auth
case "${MVNW_PASSWORD:+has-password}" in
'') MVNW_USERNAME='' MVNW_PASSWORD='' ;;
has-password) [ -n "${MVNW_USERNAME-}" ] || MVNW_USERNAME='' MVNW_PASSWORD='' ;;
esac
if [ -z "${MVNW_USERNAME-}" ] && command -v wget >/dev/null; then
verbose "Found wget ... using wget"
wget ${__MVNW_QUIET_WGET:+"$__MVNW_QUIET_WGET"} "$distributionUrl" -O "$TMP_DOWNLOAD_DIR/$distributionUrlName" || die "wget: Failed to fetch $distributionUrl"
elif [ -z "${MVNW_USERNAME-}" ] && command -v curl >/dev/null; then
verbose "Found curl ... using curl"
curl ${__MVNW_QUIET_CURL:+"$__MVNW_QUIET_CURL"} -f -L -o "$TMP_DOWNLOAD_DIR/$distributionUrlName" "$distributionUrl" || die "curl: Failed to fetch $distributionUrl"
elif set_java_home; then
verbose "Falling back to use Java to download"
javaSource="$TMP_DOWNLOAD_DIR/Downloader.java"
targetZip="$TMP_DOWNLOAD_DIR/$distributionUrlName"
cat >"$javaSource" <<-END
public class Downloader extends java.net.Authenticator
{
protected java.net.PasswordAuthentication getPasswordAuthentication()
{
return new java.net.PasswordAuthentication( System.getenv( "MVNW_USERNAME" ), System.getenv( "MVNW_PASSWORD" ).toCharArray() );
}
public static void main( String[] args ) throws Exception
{
setDefault( new Downloader() );
java.nio.file.Files.copy( java.net.URI.create( args[0] ).toURL().openStream(), java.nio.file.Paths.get( args[1] ).toAbsolutePath().normalize() );
}
}
END
# For Cygwin/MinGW, switch paths to Windows format before running javac and java
verbose " - Compiling Downloader.java ..."
"$(native_path "$JAVACCMD")" "$(native_path "$javaSource")" || die "Failed to compile Downloader.java"
verbose " - Running Downloader.java ..."
"$(native_path "$JAVACMD")" -cp "$(native_path "$TMP_DOWNLOAD_DIR")" Downloader "$distributionUrl" "$(native_path "$targetZip")"
fi
# If specified, validate the SHA-256 sum of the Maven distribution zip file
if [ -n "${distributionSha256Sum-}" ]; then
distributionSha256Result=false
if [ "$MVN_CMD" = mvnd.sh ]; then
echo "Checksum validation is not supported for maven-mvnd." >&2
echo "Please disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
exit 1
elif command -v sha256sum >/dev/null; then
if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | sha256sum -c >/dev/null 2>&1; then
distributionSha256Result=true
fi
elif command -v shasum >/dev/null; then
if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | shasum -a 256 -c >/dev/null 2>&1; then
distributionSha256Result=true
fi
else
echo "Checksum validation was requested but neither 'sha256sum' or 'shasum' are available." >&2
echo "Please install either command, or disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
exit 1
fi
if [ $distributionSha256Result = false ]; then
echo "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised." >&2
echo "If you updated your Maven version, you need to update the specified distributionSha256Sum property." >&2
exit 1
fi
fi
# unzip and move
if command -v unzip >/dev/null; then
unzip ${__MVNW_QUIET_UNZIP:+"$__MVNW_QUIET_UNZIP"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -d "$TMP_DOWNLOAD_DIR" || die "failed to unzip"
else
tar xzf${__MVNW_QUIET_TAR:+"$__MVNW_QUIET_TAR"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -C "$TMP_DOWNLOAD_DIR" || die "failed to untar"
fi
printf %s\\n "$distributionUrl" >"$TMP_DOWNLOAD_DIR/$distributionUrlNameMain/mvnw.url"
mv -- "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain" "$MAVEN_HOME" || [ -d "$MAVEN_HOME" ] || die "fail to move MAVEN_HOME"
clean || :
exec_maven "$@"

View File

@ -1,149 +0,0 @@
<# : batch portion
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM http://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Apache Maven Wrapper startup batch script, version 3.3.2
@REM
@REM Optional ENV vars
@REM MVNW_REPOURL - repo url base for downloading maven distribution
@REM MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
@REM MVNW_VERBOSE - true: enable verbose log; others: silence the output
@REM ----------------------------------------------------------------------------
@IF "%__MVNW_ARG0_NAME__%"=="" (SET __MVNW_ARG0_NAME__=%~nx0)
@SET __MVNW_CMD__=
@SET __MVNW_ERROR__=
@SET __MVNW_PSMODULEP_SAVE=%PSModulePath%
@SET PSModulePath=
@FOR /F "usebackq tokens=1* delims==" %%A IN (`powershell -noprofile "& {$scriptDir='%~dp0'; $script='%__MVNW_ARG0_NAME__%'; icm -ScriptBlock ([Scriptblock]::Create((Get-Content -Raw '%~f0'))) -NoNewScope}"`) DO @(
IF "%%A"=="MVN_CMD" (set __MVNW_CMD__=%%B) ELSE IF "%%B"=="" (echo %%A) ELSE (echo %%A=%%B)
)
@SET PSModulePath=%__MVNW_PSMODULEP_SAVE%
@SET __MVNW_PSMODULEP_SAVE=
@SET __MVNW_ARG0_NAME__=
@SET MVNW_USERNAME=
@SET MVNW_PASSWORD=
@IF NOT "%__MVNW_CMD__%"=="" (%__MVNW_CMD__% %*)
@echo Cannot start maven from wrapper >&2 && exit /b 1
@GOTO :EOF
: end batch / begin powershell #>
$ErrorActionPreference = "Stop"
if ($env:MVNW_VERBOSE -eq "true") {
$VerbosePreference = "Continue"
}
# calculate distributionUrl, requires .mvn/wrapper/maven-wrapper.properties
$distributionUrl = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionUrl
if (!$distributionUrl) {
Write-Error "cannot read distributionUrl property in $scriptDir/.mvn/wrapper/maven-wrapper.properties"
}
switch -wildcard -casesensitive ( $($distributionUrl -replace '^.*/','') ) {
"maven-mvnd-*" {
$USE_MVND = $true
$distributionUrl = $distributionUrl -replace '-bin\.[^.]*$',"-windows-amd64.zip"
$MVN_CMD = "mvnd.cmd"
break
}
default {
$USE_MVND = $false
$MVN_CMD = $script -replace '^mvnw','mvn'
break
}
}
# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
if ($env:MVNW_REPOURL) {
$MVNW_REPO_PATTERN = if ($USE_MVND) { "/org/apache/maven/" } else { "/maven/mvnd/" }
$distributionUrl = "$env:MVNW_REPOURL$MVNW_REPO_PATTERN$($distributionUrl -replace '^.*'+$MVNW_REPO_PATTERN,'')"
}
$distributionUrlName = $distributionUrl -replace '^.*/',''
$distributionUrlNameMain = $distributionUrlName -replace '\.[^.]*$','' -replace '-bin$',''
$MAVEN_HOME_PARENT = "$HOME/.m2/wrapper/dists/$distributionUrlNameMain"
if ($env:MAVEN_USER_HOME) {
$MAVEN_HOME_PARENT = "$env:MAVEN_USER_HOME/wrapper/dists/$distributionUrlNameMain"
}
$MAVEN_HOME_NAME = ([System.Security.Cryptography.MD5]::Create().ComputeHash([byte[]][char[]]$distributionUrl) | ForEach-Object {$_.ToString("x2")}) -join ''
$MAVEN_HOME = "$MAVEN_HOME_PARENT/$MAVEN_HOME_NAME"
if (Test-Path -Path "$MAVEN_HOME" -PathType Container) {
Write-Verbose "found existing MAVEN_HOME at $MAVEN_HOME"
Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"
exit $?
}
if (! $distributionUrlNameMain -or ($distributionUrlName -eq $distributionUrlNameMain)) {
Write-Error "distributionUrl is not valid, must end with *-bin.zip, but found $distributionUrl"
}
# prepare tmp dir
$TMP_DOWNLOAD_DIR_HOLDER = New-TemporaryFile
$TMP_DOWNLOAD_DIR = New-Item -Itemtype Directory -Path "$TMP_DOWNLOAD_DIR_HOLDER.dir"
$TMP_DOWNLOAD_DIR_HOLDER.Delete() | Out-Null
trap {
if ($TMP_DOWNLOAD_DIR.Exists) {
try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
}
}
New-Item -Itemtype Directory -Path "$MAVEN_HOME_PARENT" -Force | Out-Null
# Download and Install Apache Maven
Write-Verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
Write-Verbose "Downloading from: $distributionUrl"
Write-Verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"
$webclient = New-Object System.Net.WebClient
if ($env:MVNW_USERNAME -and $env:MVNW_PASSWORD) {
$webclient.Credentials = New-Object System.Net.NetworkCredential($env:MVNW_USERNAME, $env:MVNW_PASSWORD)
}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webclient.DownloadFile($distributionUrl, "$TMP_DOWNLOAD_DIR/$distributionUrlName") | Out-Null
# If specified, validate the SHA-256 sum of the Maven distribution zip file
$distributionSha256Sum = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionSha256Sum
if ($distributionSha256Sum) {
if ($USE_MVND) {
Write-Error "Checksum validation is not supported for maven-mvnd. `nPlease disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties."
}
Import-Module $PSHOME\Modules\Microsoft.PowerShell.Utility -Function Get-FileHash
if ((Get-FileHash "$TMP_DOWNLOAD_DIR/$distributionUrlName" -Algorithm SHA256).Hash.ToLower() -ne $distributionSha256Sum) {
Write-Error "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised. If you updated your Maven version, you need to update the specified distributionSha256Sum property."
}
}
# unzip and move
Expand-Archive "$TMP_DOWNLOAD_DIR/$distributionUrlName" -DestinationPath "$TMP_DOWNLOAD_DIR" | Out-Null
Rename-Item -Path "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain" -NewName $MAVEN_HOME_NAME | Out-Null
try {
Move-Item -Path "$TMP_DOWNLOAD_DIR/$MAVEN_HOME_NAME" -Destination $MAVEN_HOME_PARENT | Out-Null
} catch {
if (! (Test-Path -Path "$MAVEN_HOME" -PathType Container)) {
Write-Error "fail to move MAVEN_HOME"
}
} finally {
try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
}
Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"

View File

@ -1,71 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.3.5</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>ru.ulstu</groupId>
<artifactId>melon</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>melon</name>
<description>Demo project for Spring Boot</description>
<url/>
<licenses>
<license/>
</licenses>
<developers>
<developer/>
</developers>
<scm>
<connection/>
<developerConnection/>
<tag/>
<url/>
</scm>
<properties>
<java.version>17</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springdoc</groupId>
<artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
<version>2.5.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<excludes>
<exclude>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@ -1,13 +0,0 @@
package ru.ulstu.melon;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class MelonApplication {
public static void main(String[] args) {
SpringApplication.run(MelonApplication.class, args);
}
}

View File

@ -1,13 +0,0 @@
package ru.ulstu.melon.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;
@Configuration
public class RestTemplateConfig {
@Bean
public RestTemplate restTemplate() {
return new RestTemplate();
}
}

View File

@ -1,54 +0,0 @@
package ru.ulstu.melon.controller;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import ru.ulstu.melon.dto.CreateMelonDto;
import ru.ulstu.melon.dto.MelonDto;
import ru.ulstu.melon.dto.UpdateMelonDto;
import ru.ulstu.melon.model.Melon;
import ru.ulstu.melon.service.MelonService;
import java.util.Collection;
import java.util.UUID;
@RestController
@RequiredArgsConstructor
@RequestMapping("/melon")
public class MelonController {
private final MelonService melonService;
@GetMapping
public ResponseEntity<Collection<Melon>> get() {
return new ResponseEntity<>(melonService.get(), HttpStatus.OK);
}
@GetMapping("/{id}")
public ResponseEntity<MelonDto> get(@PathVariable UUID id) {
return new ResponseEntity<>(melonService.get(id), HttpStatus.OK);
}
@PostMapping
public ResponseEntity<MelonDto> add(@RequestBody CreateMelonDto dto) {
return new ResponseEntity<>(melonService.add(dto), HttpStatus.OK);
}
@PutMapping("/{id}")
public ResponseEntity<MelonDto> update(@PathVariable UUID id, @RequestBody UpdateMelonDto dto) {
return new ResponseEntity<>(melonService.update(id, dto), HttpStatus.OK);
}
@DeleteMapping("/{id}")
public ResponseEntity<Void> delete(@PathVariable UUID id) {
melonService.delete(id);
return new ResponseEntity<>(HttpStatus.OK);
}
}

View File

@ -1,14 +0,0 @@
package ru.ulstu.melon.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
import java.util.UUID;
@AllArgsConstructor
@Getter
public class CreateMelonDto {
private Boolean isRipe;
private Double weight;
private UUID waterMelonId;
}

View File

@ -1,28 +0,0 @@
package ru.ulstu.melon.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.Setter;
import ru.ulstu.melon.model.Melon;
import ru.ulstu.melon.model.Water;
import java.util.UUID;
@AllArgsConstructor
@Getter
@Setter
public class MelonDto {
private UUID id;
private Boolean isRipe;
private Double weight;
private UUID waterMelonId;
private WaterDto waterMelon;
public MelonDto(Melon melon, Water water) {
this.id = melon.getId();
this.isRipe = melon.getIsRipe();
this.weight = melon.getWeight();
this.waterMelonId = melon.getWaterMelonId();
this.waterMelon = new WaterDto(water);
}
}

View File

@ -1,11 +0,0 @@
package ru.ulstu.melon.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
@AllArgsConstructor
@Getter
public class UpdateMelonDto {
private Boolean isRipe;
private Double weight;
}

View File

@ -1,23 +0,0 @@
package ru.ulstu.melon.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.Setter;
import ru.ulstu.melon.model.Water;
import java.util.UUID;
@Getter
@Setter
@AllArgsConstructor
public class WaterDto {
private UUID id;
private Boolean isSweetBottom;
private Double volume;
public WaterDto(Water water) {
this.id = water.getId();
this.isSweetBottom = water.getIsSweetBottom();
this.volume = water.getVolume();
}
}

View File

@ -1,19 +0,0 @@
package ru.ulstu.melon.model;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.util.UUID;
@AllArgsConstructor
@NoArgsConstructor
@Getter
@Setter
public class Melon {
private UUID id;
private Boolean isRipe;
private Double weight;
private UUID waterMelonId;
}

View File

@ -1,18 +0,0 @@
package ru.ulstu.melon.model;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import java.util.List;
import java.util.UUID;
@AllArgsConstructor
@NoArgsConstructor
@Getter
public class Water {
private UUID id;
private Boolean isSweetBottom;
private Double volume;
private List<Melon> waterMelons;
}

View File

@ -1,88 +0,0 @@
package ru.ulstu.melon.service;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;
import org.springframework.web.server.ResponseStatusException;
import ru.ulstu.melon.dto.CreateMelonDto;
import ru.ulstu.melon.dto.MelonDto;
import ru.ulstu.melon.dto.UpdateMelonDto;
import ru.ulstu.melon.model.Melon;
import ru.ulstu.melon.model.Water;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;
@Service
@RequiredArgsConstructor
public class MelonService {
private final Map<UUID, Melon> melons = new HashMap<>();
private final RestTemplate restTemplate;
private static final String WATER_SERVICE_PATH = "http://nginx/water/water/";
public Collection<Melon> get() {
return melons.values();
}
public MelonDto get(UUID id) {
if (!melons.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Melon not found");
}
final Melon melon = melons.get(id);
return new MelonDto(melon, getWater(melon.getWaterMelonId()));
}
public MelonDto add(CreateMelonDto dto) {
Melon melon = new Melon(UUID.randomUUID(),
dto.getIsRipe(),
dto.getWeight(),
dto.getWaterMelonId());
melons.put(melon.getId(), melon);
Water water;
try {
String baseUrl = WATER_SERVICE_PATH + melon.getWaterMelonId() + "/addMelon";
water = restTemplate.postForObject(
baseUrl,
melon,
Water.class
);
} catch (RestClientException e) {
throw new RuntimeException("Failed to add melon to waterMelons: " + e.getMessage(), e);
}
return new MelonDto(melon, water);
}
public MelonDto update(UUID id, UpdateMelonDto dto) {
if (!melons.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Melon not found");
}
Melon melon = melons.get(id);
if (dto.getWeight() != null)
melon.setWeight(dto.getWeight());
if (dto.getIsRipe() != null)
melon.setIsRipe(dto.getIsRipe());
return new MelonDto(melon, getWater(melon.getWaterMelonId()));
}
public void delete(UUID id) {
if (!melons.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Melon not found");
}
melons.remove(id);
}
private Water getWater(UUID id) {
Water water;
try {
String baseUrl = WATER_SERVICE_PATH + id;
water = restTemplate.getForEntity(baseUrl, Water.class).getBody();
} catch (RestClientException e) {
throw new RuntimeException("Failed get waterMelon for melon: " + e.getMessage());
}
return water;
}
}

View File

@ -1 +0,0 @@
spring.application.name=melon

View File

@ -1,13 +0,0 @@
package ru.ulstu.melon;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
@SpringBootTest
class MelonApplicationTests {
@Test
void contextLoads() {
}
}

View File

@ -1,19 +0,0 @@
events {
worker_connections 1024;
}
http {
server {
listen 80;
listen [::]:80;
server_name localhost;
location /melon/ {
proxy_pass http://melon:8080/;
}
location /water/ {
proxy_pass http://water:8081/;
}
}
}

View File

@ -1,2 +0,0 @@
/mvnw text eol=lf
*.cmd text eol=crlf

View File

@ -1,33 +0,0 @@
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**/target/
!**/src/test/**/target/
### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache
### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr
### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/
!**/src/main/**/build/
!**/src/test/**/build/
### VS Code ###
.vscode/

View File

@ -1,19 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
wrapperVersion=3.3.2
distributionType=only-script
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.9/apache-maven-3.9.9-bin.zip

View File

@ -1,4 +0,0 @@
FROM bellsoft/liberica-openjdk-alpine:17.0.8
ADD target/water-0.0.1-SNAPSHOT.jar /app/
CMD ["java", "-Xmx200m", "-jar", "/app/water-0.0.1-SNAPSHOT.jar"]
WORKDIR /app

View File

@ -1,259 +0,0 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Apache Maven Wrapper startup batch script, version 3.3.2
#
# Optional ENV vars
# -----------------
# JAVA_HOME - location of a JDK home dir, required when download maven via java source
# MVNW_REPOURL - repo url base for downloading maven distribution
# MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
# MVNW_VERBOSE - true: enable verbose log; debug: trace the mvnw script; others: silence the output
# ----------------------------------------------------------------------------
set -euf
[ "${MVNW_VERBOSE-}" != debug ] || set -x
# OS specific support.
native_path() { printf %s\\n "$1"; }
case "$(uname)" in
CYGWIN* | MINGW*)
[ -z "${JAVA_HOME-}" ] || JAVA_HOME="$(cygpath --unix "$JAVA_HOME")"
native_path() { cygpath --path --windows "$1"; }
;;
esac
# set JAVACMD and JAVACCMD
set_java_home() {
# For Cygwin and MinGW, ensure paths are in Unix format before anything is touched
if [ -n "${JAVA_HOME-}" ]; then
if [ -x "$JAVA_HOME/jre/sh/java" ]; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
JAVACCMD="$JAVA_HOME/jre/sh/javac"
else
JAVACMD="$JAVA_HOME/bin/java"
JAVACCMD="$JAVA_HOME/bin/javac"
if [ ! -x "$JAVACMD" ] || [ ! -x "$JAVACCMD" ]; then
echo "The JAVA_HOME environment variable is not defined correctly, so mvnw cannot run." >&2
echo "JAVA_HOME is set to \"$JAVA_HOME\", but \"\$JAVA_HOME/bin/java\" or \"\$JAVA_HOME/bin/javac\" does not exist." >&2
return 1
fi
fi
else
JAVACMD="$(
'set' +e
'unset' -f command 2>/dev/null
'command' -v java
)" || :
JAVACCMD="$(
'set' +e
'unset' -f command 2>/dev/null
'command' -v javac
)" || :
if [ ! -x "${JAVACMD-}" ] || [ ! -x "${JAVACCMD-}" ]; then
echo "The java/javac command does not exist in PATH nor is JAVA_HOME set, so mvnw cannot run." >&2
return 1
fi
fi
}
# hash string like Java String::hashCode
hash_string() {
str="${1:-}" h=0
while [ -n "$str" ]; do
char="${str%"${str#?}"}"
h=$(((h * 31 + $(LC_CTYPE=C printf %d "'$char")) % 4294967296))
str="${str#?}"
done
printf %x\\n $h
}
verbose() { :; }
[ "${MVNW_VERBOSE-}" != true ] || verbose() { printf %s\\n "${1-}"; }
die() {
printf %s\\n "$1" >&2
exit 1
}
trim() {
# MWRAPPER-139:
# Trims trailing and leading whitespace, carriage returns, tabs, and linefeeds.
# Needed for removing poorly interpreted newline sequences when running in more
# exotic environments such as mingw bash on Windows.
printf "%s" "${1}" | tr -d '[:space:]'
}
# parse distributionUrl and optional distributionSha256Sum, requires .mvn/wrapper/maven-wrapper.properties
while IFS="=" read -r key value; do
case "${key-}" in
distributionUrl) distributionUrl=$(trim "${value-}") ;;
distributionSha256Sum) distributionSha256Sum=$(trim "${value-}") ;;
esac
done <"${0%/*}/.mvn/wrapper/maven-wrapper.properties"
[ -n "${distributionUrl-}" ] || die "cannot read distributionUrl property in ${0%/*}/.mvn/wrapper/maven-wrapper.properties"
case "${distributionUrl##*/}" in
maven-mvnd-*bin.*)
MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/
case "${PROCESSOR_ARCHITECTURE-}${PROCESSOR_ARCHITEW6432-}:$(uname -a)" in
*AMD64:CYGWIN* | *AMD64:MINGW*) distributionPlatform=windows-amd64 ;;
:Darwin*x86_64) distributionPlatform=darwin-amd64 ;;
:Darwin*arm64) distributionPlatform=darwin-aarch64 ;;
:Linux*x86_64*) distributionPlatform=linux-amd64 ;;
*)
echo "Cannot detect native platform for mvnd on $(uname)-$(uname -m), use pure java version" >&2
distributionPlatform=linux-amd64
;;
esac
distributionUrl="${distributionUrl%-bin.*}-$distributionPlatform.zip"
;;
maven-mvnd-*) MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/ ;;
*) MVN_CMD="mvn${0##*/mvnw}" _MVNW_REPO_PATTERN=/org/apache/maven/ ;;
esac
# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
[ -z "${MVNW_REPOURL-}" ] || distributionUrl="$MVNW_REPOURL$_MVNW_REPO_PATTERN${distributionUrl#*"$_MVNW_REPO_PATTERN"}"
distributionUrlName="${distributionUrl##*/}"
distributionUrlNameMain="${distributionUrlName%.*}"
distributionUrlNameMain="${distributionUrlNameMain%-bin}"
MAVEN_USER_HOME="${MAVEN_USER_HOME:-${HOME}/.m2}"
MAVEN_HOME="${MAVEN_USER_HOME}/wrapper/dists/${distributionUrlNameMain-}/$(hash_string "$distributionUrl")"
exec_maven() {
unset MVNW_VERBOSE MVNW_USERNAME MVNW_PASSWORD MVNW_REPOURL || :
exec "$MAVEN_HOME/bin/$MVN_CMD" "$@" || die "cannot exec $MAVEN_HOME/bin/$MVN_CMD"
}
if [ -d "$MAVEN_HOME" ]; then
verbose "found existing MAVEN_HOME at $MAVEN_HOME"
exec_maven "$@"
fi
case "${distributionUrl-}" in
*?-bin.zip | *?maven-mvnd-?*-?*.zip) ;;
*) die "distributionUrl is not valid, must match *-bin.zip or maven-mvnd-*.zip, but found '${distributionUrl-}'" ;;
esac
# prepare tmp dir
if TMP_DOWNLOAD_DIR="$(mktemp -d)" && [ -d "$TMP_DOWNLOAD_DIR" ]; then
clean() { rm -rf -- "$TMP_DOWNLOAD_DIR"; }
trap clean HUP INT TERM EXIT
else
die "cannot create temp dir"
fi
mkdir -p -- "${MAVEN_HOME%/*}"
# Download and Install Apache Maven
verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
verbose "Downloading from: $distributionUrl"
verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"
# select .zip or .tar.gz
if ! command -v unzip >/dev/null; then
distributionUrl="${distributionUrl%.zip}.tar.gz"
distributionUrlName="${distributionUrl##*/}"
fi
# verbose opt
__MVNW_QUIET_WGET=--quiet __MVNW_QUIET_CURL=--silent __MVNW_QUIET_UNZIP=-q __MVNW_QUIET_TAR=''
[ "${MVNW_VERBOSE-}" != true ] || __MVNW_QUIET_WGET='' __MVNW_QUIET_CURL='' __MVNW_QUIET_UNZIP='' __MVNW_QUIET_TAR=v
# normalize http auth
case "${MVNW_PASSWORD:+has-password}" in
'') MVNW_USERNAME='' MVNW_PASSWORD='' ;;
has-password) [ -n "${MVNW_USERNAME-}" ] || MVNW_USERNAME='' MVNW_PASSWORD='' ;;
esac
if [ -z "${MVNW_USERNAME-}" ] && command -v wget >/dev/null; then
verbose "Found wget ... using wget"
wget ${__MVNW_QUIET_WGET:+"$__MVNW_QUIET_WGET"} "$distributionUrl" -O "$TMP_DOWNLOAD_DIR/$distributionUrlName" || die "wget: Failed to fetch $distributionUrl"
elif [ -z "${MVNW_USERNAME-}" ] && command -v curl >/dev/null; then
verbose "Found curl ... using curl"
curl ${__MVNW_QUIET_CURL:+"$__MVNW_QUIET_CURL"} -f -L -o "$TMP_DOWNLOAD_DIR/$distributionUrlName" "$distributionUrl" || die "curl: Failed to fetch $distributionUrl"
elif set_java_home; then
verbose "Falling back to use Java to download"
javaSource="$TMP_DOWNLOAD_DIR/Downloader.java"
targetZip="$TMP_DOWNLOAD_DIR/$distributionUrlName"
cat >"$javaSource" <<-END
public class Downloader extends java.net.Authenticator
{
protected java.net.PasswordAuthentication getPasswordAuthentication()
{
return new java.net.PasswordAuthentication( System.getenv( "MVNW_USERNAME" ), System.getenv( "MVNW_PASSWORD" ).toCharArray() );
}
public static void main( String[] args ) throws Exception
{
setDefault( new Downloader() );
java.nio.file.Files.copy( java.net.URI.create( args[0] ).toURL().openStream(), java.nio.file.Paths.get( args[1] ).toAbsolutePath().normalize() );
}
}
END
# For Cygwin/MinGW, switch paths to Windows format before running javac and java
verbose " - Compiling Downloader.java ..."
"$(native_path "$JAVACCMD")" "$(native_path "$javaSource")" || die "Failed to compile Downloader.java"
verbose " - Running Downloader.java ..."
"$(native_path "$JAVACMD")" -cp "$(native_path "$TMP_DOWNLOAD_DIR")" Downloader "$distributionUrl" "$(native_path "$targetZip")"
fi
# If specified, validate the SHA-256 sum of the Maven distribution zip file
if [ -n "${distributionSha256Sum-}" ]; then
distributionSha256Result=false
if [ "$MVN_CMD" = mvnd.sh ]; then
echo "Checksum validation is not supported for maven-mvnd." >&2
echo "Please disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
exit 1
elif command -v sha256sum >/dev/null; then
if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | sha256sum -c >/dev/null 2>&1; then
distributionSha256Result=true
fi
elif command -v shasum >/dev/null; then
if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | shasum -a 256 -c >/dev/null 2>&1; then
distributionSha256Result=true
fi
else
echo "Checksum validation was requested but neither 'sha256sum' or 'shasum' are available." >&2
echo "Please install either command, or disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
exit 1
fi
if [ $distributionSha256Result = false ]; then
echo "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised." >&2
echo "If you updated your Maven version, you need to update the specified distributionSha256Sum property." >&2
exit 1
fi
fi
# unzip and move
if command -v unzip >/dev/null; then
unzip ${__MVNW_QUIET_UNZIP:+"$__MVNW_QUIET_UNZIP"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -d "$TMP_DOWNLOAD_DIR" || die "failed to unzip"
else
tar xzf${__MVNW_QUIET_TAR:+"$__MVNW_QUIET_TAR"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -C "$TMP_DOWNLOAD_DIR" || die "failed to untar"
fi
printf %s\\n "$distributionUrl" >"$TMP_DOWNLOAD_DIR/$distributionUrlNameMain/mvnw.url"
mv -- "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain" "$MAVEN_HOME" || [ -d "$MAVEN_HOME" ] || die "fail to move MAVEN_HOME"
clean || :
exec_maven "$@"

View File

@ -1,149 +0,0 @@
<# : batch portion
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM http://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Apache Maven Wrapper startup batch script, version 3.3.2
@REM
@REM Optional ENV vars
@REM MVNW_REPOURL - repo url base for downloading maven distribution
@REM MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
@REM MVNW_VERBOSE - true: enable verbose log; others: silence the output
@REM ----------------------------------------------------------------------------
@IF "%__MVNW_ARG0_NAME__%"=="" (SET __MVNW_ARG0_NAME__=%~nx0)
@SET __MVNW_CMD__=
@SET __MVNW_ERROR__=
@SET __MVNW_PSMODULEP_SAVE=%PSModulePath%
@SET PSModulePath=
@FOR /F "usebackq tokens=1* delims==" %%A IN (`powershell -noprofile "& {$scriptDir='%~dp0'; $script='%__MVNW_ARG0_NAME__%'; icm -ScriptBlock ([Scriptblock]::Create((Get-Content -Raw '%~f0'))) -NoNewScope}"`) DO @(
IF "%%A"=="MVN_CMD" (set __MVNW_CMD__=%%B) ELSE IF "%%B"=="" (echo %%A) ELSE (echo %%A=%%B)
)
@SET PSModulePath=%__MVNW_PSMODULEP_SAVE%
@SET __MVNW_PSMODULEP_SAVE=
@SET __MVNW_ARG0_NAME__=
@SET MVNW_USERNAME=
@SET MVNW_PASSWORD=
@IF NOT "%__MVNW_CMD__%"=="" (%__MVNW_CMD__% %*)
@echo Cannot start maven from wrapper >&2 && exit /b 1
@GOTO :EOF
: end batch / begin powershell #>
$ErrorActionPreference = "Stop"
if ($env:MVNW_VERBOSE -eq "true") {
$VerbosePreference = "Continue"
}
# calculate distributionUrl, requires .mvn/wrapper/maven-wrapper.properties
$distributionUrl = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionUrl
if (!$distributionUrl) {
Write-Error "cannot read distributionUrl property in $scriptDir/.mvn/wrapper/maven-wrapper.properties"
}
switch -wildcard -casesensitive ( $($distributionUrl -replace '^.*/','') ) {
"maven-mvnd-*" {
$USE_MVND = $true
$distributionUrl = $distributionUrl -replace '-bin\.[^.]*$',"-windows-amd64.zip"
$MVN_CMD = "mvnd.cmd"
break
}
default {
$USE_MVND = $false
$MVN_CMD = $script -replace '^mvnw','mvn'
break
}
}
# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
if ($env:MVNW_REPOURL) {
$MVNW_REPO_PATTERN = if ($USE_MVND) { "/org/apache/maven/" } else { "/maven/mvnd/" }
$distributionUrl = "$env:MVNW_REPOURL$MVNW_REPO_PATTERN$($distributionUrl -replace '^.*'+$MVNW_REPO_PATTERN,'')"
}
$distributionUrlName = $distributionUrl -replace '^.*/',''
$distributionUrlNameMain = $distributionUrlName -replace '\.[^.]*$','' -replace '-bin$',''
$MAVEN_HOME_PARENT = "$HOME/.m2/wrapper/dists/$distributionUrlNameMain"
if ($env:MAVEN_USER_HOME) {
$MAVEN_HOME_PARENT = "$env:MAVEN_USER_HOME/wrapper/dists/$distributionUrlNameMain"
}
$MAVEN_HOME_NAME = ([System.Security.Cryptography.MD5]::Create().ComputeHash([byte[]][char[]]$distributionUrl) | ForEach-Object {$_.ToString("x2")}) -join ''
$MAVEN_HOME = "$MAVEN_HOME_PARENT/$MAVEN_HOME_NAME"
if (Test-Path -Path "$MAVEN_HOME" -PathType Container) {
Write-Verbose "found existing MAVEN_HOME at $MAVEN_HOME"
Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"
exit $?
}
if (! $distributionUrlNameMain -or ($distributionUrlName -eq $distributionUrlNameMain)) {
Write-Error "distributionUrl is not valid, must end with *-bin.zip, but found $distributionUrl"
}
# prepare tmp dir
$TMP_DOWNLOAD_DIR_HOLDER = New-TemporaryFile
$TMP_DOWNLOAD_DIR = New-Item -Itemtype Directory -Path "$TMP_DOWNLOAD_DIR_HOLDER.dir"
$TMP_DOWNLOAD_DIR_HOLDER.Delete() | Out-Null
trap {
if ($TMP_DOWNLOAD_DIR.Exists) {
try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
}
}
New-Item -Itemtype Directory -Path "$MAVEN_HOME_PARENT" -Force | Out-Null
# Download and Install Apache Maven
Write-Verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
Write-Verbose "Downloading from: $distributionUrl"
Write-Verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"
$webclient = New-Object System.Net.WebClient
if ($env:MVNW_USERNAME -and $env:MVNW_PASSWORD) {
$webclient.Credentials = New-Object System.Net.NetworkCredential($env:MVNW_USERNAME, $env:MVNW_PASSWORD)
}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webclient.DownloadFile($distributionUrl, "$TMP_DOWNLOAD_DIR/$distributionUrlName") | Out-Null
# If specified, validate the SHA-256 sum of the Maven distribution zip file
$distributionSha256Sum = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionSha256Sum
if ($distributionSha256Sum) {
if ($USE_MVND) {
Write-Error "Checksum validation is not supported for maven-mvnd. `nPlease disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties."
}
Import-Module $PSHOME\Modules\Microsoft.PowerShell.Utility -Function Get-FileHash
if ((Get-FileHash "$TMP_DOWNLOAD_DIR/$distributionUrlName" -Algorithm SHA256).Hash.ToLower() -ne $distributionSha256Sum) {
Write-Error "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised. If you updated your Maven version, you need to update the specified distributionSha256Sum property."
}
}
# unzip and move
Expand-Archive "$TMP_DOWNLOAD_DIR/$distributionUrlName" -DestinationPath "$TMP_DOWNLOAD_DIR" | Out-Null
Rename-Item -Path "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain" -NewName $MAVEN_HOME_NAME | Out-Null
try {
Move-Item -Path "$TMP_DOWNLOAD_DIR/$MAVEN_HOME_NAME" -Destination $MAVEN_HOME_PARENT | Out-Null
} catch {
if (! (Test-Path -Path "$MAVEN_HOME" -PathType Container)) {
Write-Error "fail to move MAVEN_HOME"
}
} finally {
try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
}
Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"

View File

@ -1,71 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.3.5</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>ru.ulstu</groupId>
<artifactId>water</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>water</name>
<description>Demo project for Spring Boot</description>
<url/>
<licenses>
<license/>
</licenses>
<developers>
<developer/>
</developers>
<scm>
<connection/>
<developerConnection/>
<tag/>
<url/>
</scm>
<properties>
<java.version>17</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springdoc</groupId>
<artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
<version>2.5.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<excludes>
<exclude>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@ -1,13 +0,0 @@
package ru.ulstu.water;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class WaterApplication {
public static void main(String[] args) {
SpringApplication.run(WaterApplication.class, args);
}
}

View File

@ -1,58 +0,0 @@
package ru.ulstu.water.controller;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import ru.ulstu.water.dto.WaterDto;
import ru.ulstu.water.model.Melon;
import ru.ulstu.water.model.Water;
import ru.ulstu.water.service.WaterService;
import java.util.Collection;
import java.util.UUID;
@RestController
@RequiredArgsConstructor
@RequestMapping("/water")
public class WaterController {
private final WaterService waterService;
@GetMapping
public ResponseEntity<Collection<Water>> get() {
return new ResponseEntity<>(waterService.get(), HttpStatus.OK);
}
@GetMapping("/{id}")
public ResponseEntity<Water> get(@PathVariable UUID id) {
return new ResponseEntity<>(waterService.get(id), HttpStatus.OK);
}
@PostMapping
public ResponseEntity<Water> add(@RequestBody WaterDto dto) {
return new ResponseEntity<>(waterService.add(dto), HttpStatus.OK);
}
@PutMapping("/{id}")
public ResponseEntity<Water> update(@PathVariable UUID id, @RequestBody WaterDto dto) {
return new ResponseEntity<>(waterService.update(id, dto), HttpStatus.OK);
}
@DeleteMapping("/{id}")
public ResponseEntity<Void> delete(@PathVariable UUID id) {
waterService.delete(id);
return new ResponseEntity<>(HttpStatus.OK);
}
@PostMapping("/{id}/addMelon")
public ResponseEntity<Water> addMelon(@PathVariable UUID id, @RequestBody Melon melon) {
return new ResponseEntity<>(waterService.addMelon(id, melon), HttpStatus.OK);
}
}

View File

@ -1,11 +0,0 @@
package ru.ulstu.water.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
@AllArgsConstructor
@Getter
public class WaterDto {
private Boolean isSweetBottom;
private Double volume;
}

View File

@ -1,19 +0,0 @@
package ru.ulstu.water.model;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.util.UUID;
@AllArgsConstructor
@NoArgsConstructor
@Getter
@Setter
public class Melon {
private UUID id;
private Boolean isRipe;
private Double weight;
private UUID waterMelonId;
}

View File

@ -1,20 +0,0 @@
package ru.ulstu.water.model;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.util.List;
import java.util.UUID;
@AllArgsConstructor
@NoArgsConstructor
@Getter
@Setter
public class Water {
private UUID id;
private Boolean isSweetBottom;
private Double volume;
private List<Melon> waterMelons;
}

View File

@ -1,64 +0,0 @@
package ru.ulstu.water.service;
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Service;
import org.springframework.web.server.ResponseStatusException;
import ru.ulstu.water.dto.WaterDto;
import ru.ulstu.water.model.Melon;
import ru.ulstu.water.model.Water;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;
@Service
public class WaterService {
private final Map<UUID, Water> waters = new HashMap<>();
public Collection<Water> get() {
return waters.values();
}
public Water get(UUID id) {
if (!waters.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Water not found");
}
return waters.get(id);
}
public Water add(WaterDto dto) {
Water water = new Water(UUID.randomUUID(),
dto.getIsSweetBottom(),
dto.getVolume(),
new ArrayList<>());
waters.put(water.getId(), water);
return water;
}
public Water update(UUID id, WaterDto dto) {
Water water = get(id); // throws 404 if the id is unknown, like get() and delete()
if (dto.getIsSweetBottom() != null)
water.setIsSweetBottom(dto.getIsSweetBottom());
if (dto.getVolume() != null)
water.setVolume(dto.getVolume());
return water;
}
public void delete(UUID id) {
if (!waters.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Water not found");
}
waters.remove(id);
}
public Water addMelon(UUID id, Melon melon) {
if (!waters.containsKey(id)) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Water not found");
}
final Water water = waters.get(id);
water.getWaterMelons().add(melon);
return water;
}
}

View File

@ -1,2 +0,0 @@
spring.application.name=water
server.port=8081

View File

@ -1,13 +0,0 @@
package ru.ulstu.water;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
@SpringBootTest
class WaterApplicationTests {
@Test
void contextLoads() {
}
}

View File

@ -0,0 +1,29 @@
# Laboratory work 8 - On how distributed systems are organized
## ПИбд-42 || Алейкин Артем
### Answers to the questions
#### Why are complex systems (for example, the VKontakte social network) written in a "distributed" style, where each individual application (or service) handles only a limited range of tasks?
First of all, it is convenient, efficient and safer. In the distributed style the coupling between parts of the code (which is effectively split into microservices) is greatly weakened, and this gives us modularity.
> Modularity by itself brings many advantages.
> > 1. Completely different teams can be hired to develop specific modules, i.e. each module (service) can evolve and scale independently of the others.
> > 2. Debugging, error tracking and code control all become simpler, because it is immediately clear which module of the application contains the faulty code, so a developer can quickly be assigned to fix it.
> > 3. Fault tolerance: in VKontakte, for example, the music service may go down, but that only means this one service is unavailable, while the others do not even notice and keep working normally.
#### What were application orchestration systems created for? In what ways do they simplify / complicate the development and maintenance of distributed systems?
They are needed to start and manage all of an application's services from a single, centralized place.
They also make it much easier to deploy the application on other machines.
#### What are message-processing queues needed for, and what can be meant by "messages"?
Message queues are needed so that services can "talk" to each other.
Some services depend on the results of other services, so the second service has to be "notified" that it should do something; a message is simply a piece of serialized data (an event or a task description) placed into the queue for that purpose.
Queues are also responsible for keeping the transmitted data safe until it is consumed.
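For illustration, here is a minimal sketch of the idea using the pika client (it assumes a RabbitMQ broker on localhost with the default guest account; the queue name and event fields are made up): one service publishes a serialized event into a queue, another reads it and acknowledges it.

```python
import json

import pika

# Assumes RabbitMQ is running on localhost with the default guest/guest account.
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='orders', durable=True)

# Producer side: publish an event describing what happened.
channel.basic_publish(
    exchange='',
    routing_key='orders',
    body=json.dumps({'event': 'order_created', 'order_id': 42}),
    properties=pika.BasicProperties(delivery_mode=pika.DeliveryMode.Persistent),
)

# Consumer side: another service subscribes to the same queue and reacts.
def handle(ch, method, properties, body):
    print('received:', json.loads(body))
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue='orders', on_message_callback=handle)
# channel.start_consuming()  # blocks until interrupted; left commented in this sketch
connection.close()
```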
#### What advantages and disadvantages of distributed applications do you see?
I covered the advantages in enough detail in the first answer, so only the disadvantages are mentioned here.
I would list the following factors as disadvantages:
> 1. Weaker centralized control: each module (service) is more autonomous, and (when the project is developed by a team rather than a single person) we do not always know what is happening inside it. It becomes something of a black box: we pass data in and take the result out, while the way the result is produced stays hidden from us.
> 2. Higher overall development complexity: more time is needed to design the application architecture, there is overhead from writing the glue code between services, and catching distributed errors is noticeably harder.
#### Is it reasonable to introduce parallel computing into a complex distributed system? Give examples of when it is really needed and when it is not.
Parallel computing is worthwhile when a task requires processing large volumes of data or performing heavy computations. For example, analysing user activity, training machine-learning models or processing video benefits from being spread across nodes. But if a task does not need high performance and is easy to solve sequentially, introducing parallelism may complicate the system without any tangible benefit.
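As a rough, self-contained sketch (the numbers are arbitrary) of where parallelism pays off, the same CPU-bound work can be run sequentially and via a process pool and the timings compared:

```python
import time
from multiprocessing import Pool

def heavy(n):
    # Stand-in for a CPU-bound task (e.g. feature extraction for one user).
    return sum(i * i for i in range(n))

if __name__ == '__main__':
    jobs = [2_000_000] * 8

    start = time.perf_counter()
    sequential = [heavy(n) for n in jobs]
    print('sequential:', time.perf_counter() - start)

    start = time.perf_counter()
    with Pool(processes=4) as pool:
        parallel = pool.map(heavy, jobs)
    print('parallel:  ', time.perf_counter() - start)

    assert sequential == parallel
```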

View File

@ -1,15 +0,0 @@
services:
vacancies-service:
build: ./vacancy-service
ports:
- "5000:5000"
resumes-service:
build: ./resume-service
ports:
- "5001:5001"
gateway:
image: nginx:latest
ports:
- "80:80"
volumes:
- ./nginx.conf:/etc/nginx/conf.d/default.conf

View File

@ -1,19 +0,0 @@
upstream vacancies {
server vacancies-service:5000;
}
upstream resumes {
server resumes-service:5001;
}
server {
listen 80;
location /vacancies {
proxy_pass http://vacancies;
}
location /resumes {
proxy_pass http://resumes;
}
}

View File

@ -1,35 +0,0 @@
## Laboratory work No. 3, ПИбд-42, Артамонова Татьяна
### Goal:
* Implement two microservices that interact with each other through synchronous message exchange (HTTP requests). An Nginx gateway, set up with Docker Compose, is used to access the microservices.
### Technologies:
* Python: the programming language used to implement the microservices.
* Flask: a Python web framework, used to build the REST API of the microservices.
* requests: a Python library for sending HTTP requests, used for synchronous message exchange between the microservices.
* flask_cors: a Flask extension that enables CORS, so the microservices can be called from other origins.
* Docker: containerization technology for packaging and running the microservices.
* Docker Compose: a tool for defining and managing multi-container applications, used to run the microservices and the Nginx gateway.
* Nginx: a reverse proxy server, used as the gateway for accessing the microservices.
### Functionality:
#### The vacancies-service microservice:
* Implements CRUD operations for vacancies (GET, POST, PUT, DELETE).
* Stores vacancy data in memory (in the vacancies dictionary).
* Fetches resume information from the resumes-service microservice via an HTTP request.
* Includes the resume information in the JSON response for a vacancy.
#### The resumes-service microservice:
* Implements CRUD operations for resumes (GET, POST, PUT, DELETE).
* Stores resume data in memory (in the resumes dictionary).
#### The Nginx gateway:
* Forwards HTTP requests to the corresponding microservices.
* Provides a single entry point for accessing the microservices.
### Running the program:
* Run the command docker-compose up -d (an example of calling the services through the gateway is shown below).
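A minimal sketch of exercising both services through the gateway after startup (it assumes the gateway is published on localhost port 80, as in docker-compose.yml; the payload values are made up):

```python
import requests

BASE = 'http://localhost:80'  # Nginx gateway from docker-compose.yml

# Create a resume in resumes-service.
resume = requests.post(f'{BASE}/resumes', json={
    'full_name': 'Ivan Ivanov',
    'skills': 'Python, Flask',
    'experience': '3 years',
    'phone': '+7 900 000-00-00',
    'email': 'ivan@example.com',
}).json()

# Create a vacancy in vacancies-service that references that resume.
vacancy = requests.post(f'{BASE}/vacancies', json={
    'title': 'Backend developer',
    'company': 'Example LLC',
    'description': 'Flask microservices',
    'salary': 100000,
    'resume_uuid': resume['uuid'],
}).json()

# Reading the vacancy back also returns resume_info fetched synchronously
# from resumes-service by the vacancies-service.
print(requests.get(f"{BASE}/vacancies/{vacancy['uuid']}").json())
```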
### Video link:
https://vk.com/artamonovat?z=video212084908_456239358%2Fvideos212084908%2Fpl_212084908_-2

View File

@ -1,11 +0,0 @@
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "resume.py"]

View File

@ -1,3 +0,0 @@
flask
requests
flask_cors

View File

@ -1,79 +0,0 @@
import uuid
import json
from flask import Flask, request, jsonify
from flask_cors import CORS
class Resume:
def __init__(self, uuid, full_name, skills, experience, phone, email):
self.uuid = uuid
self.full_name = full_name
self.skills = skills
self.experience = experience
self.phone = phone
self.email = email
app = Flask(__name__)
CORS(app)
resumes = {}
@app.route("/resumes", methods=["GET"])
def get_resumes():
return jsonify([resume.__dict__ for resume in resumes.values()])
@app.route("/resumes/<resume_uuid>", methods=["GET"])
def get_resume(resume_uuid):
resume = resumes.get(resume_uuid)
if resume:
return jsonify(resume.__dict__)
else:
return jsonify({"error": "Resume not found"}), 404
@app.route("/resumes", methods=["POST"])
def create_resume():
data = request.get_json()
resume_uuid = str(uuid.uuid4())
resume = Resume(
resume_uuid,
data["full_name"],
data["skills"],
data["experience"],
data["phone"],
data["email"],
)
resumes[resume_uuid] = resume
return jsonify(resume.__dict__), 201
@app.route("/resumes/<resume_uuid>", methods=["PUT"])
def update_resume(resume_uuid):
resume = resumes.get(resume_uuid)
if resume:
data = request.get_json()
resume.full_name = data.get("full_name", resume.full_name)
resume.skills = data.get("skills", resume.skills)
resume.experience = data.get("experience", resume.experience)
resume.phone = data.get("phone", resume.phone)
resume.email = data.get("email", resume.email)
return jsonify(resume.__dict__)
else:
return jsonify({"error": "Resume not found"}), 404
@app.route("/resumes/<resume_uuid>", methods=["DELETE"])
def delete_resume(resume_uuid):
resume = resumes.get(resume_uuid)
if resume:
del resumes[resume_uuid]
return "", 200
else:
return jsonify({"error": "Resume not found"}), 404
if __name__ == "__main__":
app.run(debug=True, host="0.0.0.0", port=5001)

View File

@ -1,11 +0,0 @@
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "vacancy.py"]

View File

@ -1,3 +0,0 @@
flask
requests
flask_cors

View File

@ -1,124 +0,0 @@
import uuid
import json
from flask import Flask, request, jsonify
from flask_cors import CORS
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry
class Vacancy:
def __init__(self, uuid, title, company, description, salary, resume_uuid):
self.uuid = uuid
self.title = title
self.company = company
self.description = description
self.salary = salary
self.resume_uuid = resume_uuid
self.resume_info = None
def to_dict(self):
if self.resume_info:
return {
"uuid": self.uuid,
"title": self.title,
"company": self.company,
"description": self.description,
"salary": self.salary,
"resume_uuid": self.resume_uuid,
"resume_info": self.resume_info
}
else:
return {
"uuid": self.uuid,
"title": self.title,
"company": self.company,
"description": self.description,
"salary": self.salary,
"resume_uuid": self.resume_uuid
}
app = Flask(__name__)
CORS(app)
vacancies = {}
@app.route("/vacancies", methods=["GET"])
def get_vacancies():
return jsonify([vacancy.to_dict() for vacancy in vacancies.values()])
@app.route("/vacancies/<vacancy_uuid>", methods=["GET"])
def get_vacancy(vacancy_uuid):
vacancy = vacancies.get(vacancy_uuid)
if vacancy:
if not vacancy.resume_info:
vacancy.resume_info = get_resume_info(vacancy.resume_uuid)
return jsonify(vacancy.to_dict())
else:
return jsonify({"error": "Vacancy not found"}), 404
@app.route("/vacancies", methods=["POST"])
def create_vacancy():
data = request.get_json()
vacancy_uuid = str(uuid.uuid4())
vacancy = Vacancy(
vacancy_uuid,
data["title"],
data["company"],
data["description"],
data["salary"],
data["resume_uuid"],
)
vacancies[vacancy_uuid] = vacancy
vacancy.resume_info = get_resume_info(vacancy.resume_uuid)
return jsonify(vacancy.to_dict()), 201
@app.route("/vacancies/<vacancy_uuid>", methods=["PUT"])
def update_vacancy(vacancy_uuid):
vacancy = vacancies.get(vacancy_uuid)
if vacancy:
data = request.get_json()
vacancy.title = data.get("title", vacancy.title)
vacancy.company = data.get("company", vacancy.company)
vacancy.description = data.get("description", vacancy.description)
vacancy.salary = data.get("salary", vacancy.salary)
vacancy.resume_uuid = data.get("resume_uuid", vacancy.resume_uuid)
vacancy.resume_info = get_resume_info(vacancy.resume_uuid)
return jsonify(vacancy.to_dict())
else:
return jsonify({"error": "Vacancy not found"}), 404
@app.route("/vacancies/<vacancy_uuid>", methods=["DELETE"])
def delete_vacancy(vacancy_uuid):
vacancy = vacancies.get(vacancy_uuid)
if vacancy:
del vacancies[vacancy_uuid]
return "", 200
else:
return jsonify({"error": "Vacancy not found"}), 404
def get_resume_info(resume_uuid):
url = f'http://resumes-service:5001/resumes/{resume_uuid}'
# Configure the retry mechanism
retries = Retry(
total=3, # maximum number of retry attempts
status_forcelist=[429, 500, 502, 503, 504], # status codes for which the request is retried
backoff_factor=0.3, # back-off delay factor between attempts
)
adapter = HTTPAdapter(max_retries=retries)
http = requests.Session()
http.mount("https://", adapter)
http.mount("http://", adapter)
try:
response = http.get(url)
if response.status_code == 200:
return response.json()
else:
return None
except requests.exceptions.RequestException as e:
print(f"Ошибка при запросе к resumes-service: {e}")
return None
if __name__ == "__main__":
app.run(debug=True, host="0.0.0.0", port=5000)

View File

@ -1,23 +0,0 @@
import pika
import json
import time
credentials = pika.PlainCredentials('guest', 'guest')
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost', credentials=credentials))
channel = connection.channel()
channel.exchange_declare(exchange='order_events', exchange_type='fanout')  # idempotent; lets the consumer start before the publisher
channel.queue_declare(queue='order_queue_1')
channel.queue_bind(exchange='order_events', queue='order_queue_1')
def callback(ch, method, properties, body):
event = json.loads(body.decode('utf-8'))
print(f'Received event (queue 1): {event}')
print(f'Processing order {event["order_id"]}...')
time.sleep(2)
channel.basic_consume(queue='order_queue_1', on_message_callback=callback, auto_ack=True)
print('Waiting for messages (queue 1)...')
channel.start_consuming()

View File

@ -1,22 +0,0 @@
import pika
import json
import time
credentials = pika.PlainCredentials('guest', 'guest')
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost', credentials=credentials))
channel = connection.channel()
channel.exchange_declare(exchange='order_events', exchange_type='fanout')  # idempotent; lets the consumer start before the publisher
channel.queue_declare(queue='order_queue_2')
channel.queue_bind(exchange='order_events', queue='order_queue_2')
def callback(ch, method, properties, body):
event = json.loads(body.decode('utf-8'))
print(f'Received event (queue 2): {event}')
print(f'Finished processing order {event["order_id"]}.')
channel.basic_consume(queue='order_queue_2', on_message_callback=callback, auto_ack=True)
print('Waiting for messages (queue 2)...')
channel.start_consuming()

View File

@ -1,31 +0,0 @@
import pika
import json
import time
import random
credentials = pika.PlainCredentials('guest', 'guest')
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost', credentials=credentials))
channel = connection.channel()
channel.exchange_declare(exchange='order_events', exchange_type='fanout')
while True:
event = {
'event_type': 'order_created',
'order_id': random.randint(1000, 9999),
'customer_name': f'Customer {random.randint(1, 100)}',
'product_name': f'Product {random.randint(1, 10)}',
'quantity': random.randint(1, 10),
'timestamp': time.time()
}
channel.basic_publish(
exchange='order_events',
routing_key='',
body=json.dumps(event)
)
print(f'Published event: {event}')
time.sleep(1)
connection.close()

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@ -1,26 +0,0 @@
## Laboratory work No. 4, ПИбд-42, Артамонова Татьяна
### Completing the tutorial
1. ![tutorial-1.png](images/tutorial-1.png)
2. ![tutorial-2.png](images/tutorial-2.png)
3. ![tutorial-3.png](images/tutorial-3.png)
### Running the applications and analysing processing speed
1. Start the Publisher, Consumer 1 and Consumer 2.
* Consumer 1: the order_queue_1 queue accumulates a backlog of messages, because Consumer 1 processes each message with a 2-second delay.
* Consumer 2: the order_queue_2 queue stays practically empty, because Consumer 2 processes messages instantly.
2. Start several copies of Consumer 1 (see the sketch after the screenshots below).
* The order_queue_1 queue holds fewer messages, because 3 consumers process messages faster than 1 consumer does.
![queue1.png](images/queue1.png)
![queue2.png](images/queue2.png)
![exchange.png](images/exchange.png)
![consumers.png](images/consumers.png)
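A small illustrative sketch of step 2 (the file name consumer_1.py is an assumption; substitute the actual Consumer 1 script):

```python
import subprocess
import sys

# Hypothetical script name; replace with the actual Consumer 1 file.
CONSUMER_SCRIPT = "consumer_1.py"

# Start three independent copies of Consumer 1; RabbitMQ distributes
# messages from order_queue_1 between them.
processes = [subprocess.Popen([sys.executable, CONSUMER_SCRIPT]) for _ in range(3)]

# The consumers run until interrupted (Ctrl+C).
for p in processes:
    p.wait()
```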
### Video
https://vk.com/video/@artamonovat?section=upload&z=video212084908_456239359

View File

@ -1,25 +0,0 @@
import pika, sys, os
def main():
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
def callback(ch, method, properties, body):
print(f" [x] Received {body}")
channel.basic_consume(queue='hello', on_message_callback=callback, auto_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
print('Interrupted')
try:
sys.exit(0)
except SystemExit:
os._exit(0)

View File

@ -1,11 +0,0 @@
import pika
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
channel.basic_publish(exchange='', routing_key='hello', body='Hello World!')
print(" [x] Sent 'Hello World!'")
connection.close()

View File

@ -1,19 +0,0 @@
import pika
import sys
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
message = ' '.join(sys.argv[1:]) or "Hello World!"
channel.basic_publish(
exchange='',
routing_key='task_queue',
body=message,
properties=pika.BasicProperties(
delivery_mode=pika.DeliveryMode.Persistent
))
print(f" [x] Sent {message}")
connection.close()

View File

@ -1,22 +0,0 @@
import pika
import time
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
def callback(ch, method, properties, body):
print(f" [x] Received {body.decode()}")
time.sleep(body.count(b'.'))
print(" [x] Done")
ch.basic_ack(delivery_tag=method.delivery_tag)
channel.basic_qos(prefetch_count=1)
channel.basic_consume(queue='task_queue', on_message_callback=callback)
channel.start_consuming()

View File

@ -1,13 +0,0 @@
import pika
import sys
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')
message = ' '.join(sys.argv[1:]) or "info: Hello World!"
channel.basic_publish(exchange='logs', routing_key='', body=message)
print(f" [x] Sent {message}")
connection.close()

View File

@ -1,22 +0,0 @@
import pika
connection = pika.BlockingConnection(
pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')
result = channel.queue_declare(queue='', exclusive=True)
queue_name = result.method.queue
channel.queue_bind(exchange='logs', queue=queue_name)
print(' [*] Waiting for logs. To exit press CTRL+C')
def callback(ch, method, properties, body):
print(f" [x] {body}")
channel.basic_consume(
queue=queue_name, on_message_callback=callback, auto_ack=True)
channel.start_consuming()

View File

@ -1,30 +0,0 @@
**/.classpath
**/.dockerignore
**/.env
**/.git
**/.gitignore
**/.project
**/.settings
**/.toolstarget
**/.vs
**/.vscode
**/*.*proj.user
**/*.dbmdl
**/*.jfm
**/azds.yaml
**/bin
**/charts
**/docker-compose*
**/Dockerfile*
**/node_modules
**/npm-debug.log
**/obj
**/secrets.dev.yaml
**/values.dev.yaml
LICENSE
README.md
!**/.gitignore
!.git/HEAD
!.git/config
!.git/packed-refs
!.git/refs/heads/**

View File

@ -1,484 +0,0 @@
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.
##
## Get latest from `dotnet new gitignore`
# dotenv files
.env
# User-specific files
*.rsuser
*.suo
*.user
*.userosscache
*.sln.docstates
# User-specific files (MonoDevelop/Xamarin Studio)
*.userprefs
# Mono auto generated files
mono_crash.*
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
[Ww][Ii][Nn]32/
[Aa][Rr][Mm]/
[Aa][Rr][Mm]64/
bld/
[Bb]in/
[Oo]bj/
[Ll]og/
[Ll]ogs/
# Visual Studio 2015/2017 cache/options directory
.vs/
# Uncomment if you have tasks that create the project's static files in wwwroot
#wwwroot/
# Visual Studio 2017 auto generated files
Generated\ Files/
# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*
# NUnit
*.VisualState.xml
TestResult.xml
nunit-*.xml
# Build Results of an ATL Project
[Dd]ebugPS/
[Rr]eleasePS/
dlldata.c
# Benchmark Results
BenchmarkDotNet.Artifacts/
# .NET
project.lock.json
project.fragment.lock.json
artifacts/
# Tye
.tye/
# ASP.NET Scaffolding
ScaffoldingReadMe.txt
# StyleCop
StyleCopReport.xml
# Files built by Visual Studio
*_i.c
*_p.c
*_h.h
*.ilk
*.meta
*.obj
*.iobj
*.pch
*.pdb
*.ipdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*_wpftmp.csproj
*.log
*.tlog
*.vspscc
*.vssscc
.builds
*.pidb
*.svclog
*.scc
# Chutzpah Test files
_Chutzpah*
# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opendb
*.opensdf
*.sdf
*.cachefile
*.VC.db
*.VC.VC.opendb
# Visual Studio profiler
*.psess
*.vsp
*.vspx
*.sap
# Visual Studio Trace Files
*.e2e
# TFS 2012 Local Workspace
$tf/
# Guidance Automation Toolkit
*.gpState
# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper
*.DotSettings.user
# TeamCity is a build add-in
_TeamCity*
# DotCover is a Code Coverage Tool
*.dotCover
# AxoCover is a Code Coverage Tool
.axoCover/*
!.axoCover/settings.json
# Coverlet is a free, cross platform Code Coverage Tool
coverage*.json
coverage*.xml
coverage*.info
# Visual Studio code coverage results
*.coverage
*.coveragexml
# NCrunch
_NCrunch_*
.*crunch*.local.xml
nCrunchTemp_*
# MightyMoose
*.mm.*
AutoTest.Net/
# Web workbench (sass)
.sass-cache/
# Installshield output folder
[Ee]xpress/
# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html
# Click-Once directory
publish/
# Publish Web Output
*.[Pp]ublish.xml
*.azurePubxml
# Note: Comment the next line if you want to checkin your web deploy settings,
# but database connection strings (with potential passwords) will be unencrypted
*.pubxml
*.publishproj
# Microsoft Azure Web App publish settings. Comment the next line if you want to
# checkin your Azure Web App publish settings, but sensitive information contained
# in these scripts will be unencrypted
PublishScripts/
# NuGet Packages
*.nupkg
# NuGet Symbol Packages
*.snupkg
# The packages folder can be ignored because of Package Restore
**/[Pp]ackages/*
# except build/, which is used as an MSBuild target.
!**/[Pp]ackages/build/
# Uncomment if necessary however generally it will be regenerated when needed
#!**/[Pp]ackages/repositories.config
# NuGet v3's project.json files produces more ignorable files
*.nuget.props
*.nuget.targets
# Microsoft Azure Build Output
csx/
*.build.csdef
# Microsoft Azure Emulator
ecf/
rcf/
# Windows Store app package directories and files
AppPackages/
BundleArtifacts/
Package.StoreAssociation.xml
_pkginfo.txt
*.appx
*.appxbundle
*.appxupload
# Visual Studio cache files
# files ending in .cache can be ignored
*.[Cc]ache
# but keep track of directories ending in .cache
!?*.[Cc]ache/
# Others
ClientBin/
~$*
*~
*.dbmdl
*.dbproj.schemaview
*.jfm
*.pfx
*.publishsettings
orleans.codegen.cs
# Including strong name files can present a security risk
# (https://github.com/github/gitignore/pull/2483#issue-259490424)
#*.snk
# Since there are multiple workflows, uncomment next line to ignore bower_components
# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
#bower_components/
# RIA/Silverlight projects
Generated_Code/
# Backup & report files from converting an old project file
# to a newer Visual Studio version. Backup files are not needed,
# because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm
ServiceFabricBackup/
*.rptproj.bak
# SQL Server files
*.mdf
*.ldf
*.ndf
# Business Intelligence projects
*.rdl.data
*.bim.layout
*.bim_*.settings
*.rptproj.rsuser
*- [Bb]ackup.rdl
*- [Bb]ackup ([0-9]).rdl
*- [Bb]ackup ([0-9][0-9]).rdl
# Microsoft Fakes
FakesAssemblies/
# GhostDoc plugin setting file
*.GhostDoc.xml
# Node.js Tools for Visual Studio
.ntvs_analysis.dat
node_modules/
# Visual Studio 6 build log
*.plg
# Visual Studio 6 workspace options file
*.opt
# Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
*.vbw
# Visual Studio 6 auto-generated project file (contains which files were open etc.)
*.vbp
# Visual Studio 6 workspace and project file (working project files containing files to include in project)
*.dsw
*.dsp
# Visual Studio 6 technical files
*.ncb
*.aps
# Visual Studio LightSwitch build output
**/*.HTMLClient/GeneratedArtifacts
**/*.DesktopClient/GeneratedArtifacts
**/*.DesktopClient/ModelManifest.xml
**/*.Server/GeneratedArtifacts
**/*.Server/ModelManifest.xml
_Pvt_Extensions
# Paket dependency manager
.paket/paket.exe
paket-files/
# FAKE - F# Make
.fake/
# CodeRush personal settings
.cr/personal
# Python Tools for Visual Studio (PTVS)
__pycache__/
*.pyc
# Cake - Uncomment if you are using it
# tools/**
# !tools/packages.config
# Tabs Studio
*.tss
# Telerik's JustMock configuration file
*.jmconfig
# BizTalk build output
*.btp.cs
*.btm.cs
*.odx.cs
*.xsd.cs
# OpenCover UI analysis results
OpenCover/
# Azure Stream Analytics local run output
ASALocalRun/
# MSBuild Binary and Structured Log
*.binlog
# NVidia Nsight GPU debugger configuration file
*.nvuser
# MFractors (Xamarin productivity tool) working folder
.mfractor/
# Local History for Visual Studio
.localhistory/
# Visual Studio History (VSHistory) files
.vshistory/
# BeatPulse healthcheck temp database
healthchecksdb
# Backup folder for Package Reference Convert tool in Visual Studio 2017
MigrationBackup/
# Ionide (cross platform F# VS Code tools) working folder
.ionide/
# Fody - auto-generated XML schema
FodyWeavers.xsd
# VS Code files for those working on multiple tools
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
*.code-workspace
# Local History for Visual Studio Code
.history/
# Windows Installer files from build outputs
*.cab
*.msi
*.msix
*.msm
*.msp
# JetBrains Rider
*.sln.iml
.idea
##
## Visual studio for Mac
##
# globs
Makefile.in
*.userprefs
*.usertasks
config.make
config.status
aclocal.m4
install-sh
autom4te.cache/
*.tar.gz
tarballs/
test-results/
# Mac bundle stuff
*.dmg
*.app
# content below from: https://github.com/github/gitignore/blob/master/Global/macOS.gitignore
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
# content below from: https://github.com/github/gitignore/blob/master/Global/Windows.gitignore
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db
# Dump file
*.stackdump
# Folder config file
[Dd]esktop.ini
# Recycle Bin used on file shares
$RECYCLE.BIN/
# Windows Installer files
*.cab
*.msi
*.msix
*.msm
*.msp
# Windows shortcuts
*.lnk
# Vim temporary swap files
*.swp

View File

@ -1,18 +0,0 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<UserSecretsId>9589d6f5-875c-4d7d-8e68-37a2077e80be</UserSecretsId>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
<DockerfileContext>.</DockerfileContext>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="8.0.10" />
<PackageReference Include="Microsoft.VisualStudio.Azure.Containers.Tools.Targets" Version="1.21.0" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.4.0" />
</ItemGroup>
</Project>

View File

@ -1,6 +0,0 @@
@ApiRestaurant_HostAddress = http://localhost:5089
GET {{ApiRestaurant_HostAddress}}/weatherforecast/
Accept: application/json
###

View File

@ -1,25 +0,0 @@

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.11.35222.181
MinimumVisualStudioVersion = 10.0.40219.1
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "ApiRestaurant", "ApiRestaurant.csproj", "{6E19ADDC-7351-4145-9C49-B0CC87BD1206}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{6E19ADDC-7351-4145-9C49-B0CC87BD1206}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{6E19ADDC-7351-4145-9C49-B0CC87BD1206}.Debug|Any CPU.Build.0 = Debug|Any CPU
{6E19ADDC-7351-4145-9C49-B0CC87BD1206}.Release|Any CPU.ActiveCfg = Release|Any CPU
{6E19ADDC-7351-4145-9C49-B0CC87BD1206}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {C62DE3C6-63AA-4075-9729-9F5ECD4E7B51}
EndGlobalSection
EndGlobal

View File

@ -1,59 +0,0 @@
using ApiRestaurant.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
namespace ApiRestaurant.Controllers;
[Route("api/[controller]")]
[ApiController]
public class RestaurantController : ControllerBase
{
private RestaurantContext _context;
public RestaurantController(RestaurantContext context)
{
_context = context;
}
[HttpGet]
public Task<List<Restaurant>> GetAll() {
return _context.Restaurants.ToListAsync();
}
[HttpGet("{id}")]
public async Task<ActionResult<Restaurant?>> GetOne(Guid id) {
var restaurant = await _context.Restaurants.FindAsync(id);
if (restaurant == null) return NotFound($"Restaurant with [id: '{id}'] not found");
var waiters = await WaiterApiClient.GetAllForRestaurant(id);
restaurant.Waiters = waiters;
return restaurant;
}
[HttpPost]
public ActionResult<Restaurant> Create(RestaurantDTO restaurant) {
var newRestaurant = new Restaurant { Name = restaurant.Name };
var res = _context.Restaurants.Add(newRestaurant);
_context.SaveChanges();
return res.Entity;
}
[HttpPut("{id}")]
public ActionResult<Restaurant> Update(Guid id, RestaurantDTO restaurant) {
var oldRestaurant = _context.Restaurants.FirstOrDefault(r => r.Id == id);
if (oldRestaurant == null) return NotFound();
oldRestaurant.Name = restaurant.Name;
var res = _context.Restaurants.Update(oldRestaurant);
_context.SaveChanges();
return res.Entity;
}
[HttpDelete]
public ActionResult Delete(Guid id) {
var restaurant = _context.Restaurants.FirstOrDefault(r => r.Id == id);
if (restaurant is null) return NotFound();
_context.Restaurants.Remove(restaurant);
_context.SaveChanges();
return Ok();
}
}
public record RestaurantDTO(string Name);

View File

@ -1,30 +0,0 @@
# See https://aka.ms/customizecontainer to learn how to customize your debug container and how Visual Studio uses this Dockerfile to build images for faster debugging.
# This stage is used when running from VS in fast mode (default for the Debug configuration)
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS base
USER app
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
# This stage is used to build the service project
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["ApiRestaurant.csproj", "."]
RUN dotnet restore "./ApiRestaurant.csproj"
COPY . .
WORKDIR "/src/."
RUN dotnet build "./ApiRestaurant.csproj" -c $BUILD_CONFIGURATION -o /app/build
# This stage is used to publish the service project, which is copied into the final stage
FROM build AS publish
ARG BUILD_CONFIGURATION=Release
RUN dotnet publish "./ApiRestaurant.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false
# This stage is used in production or when running from VS in regular mode (default when the Debug configuration is not used)
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "ApiRestaurant.dll"]

View File

@ -1,8 +0,0 @@
namespace ApiRestaurant.Models;
public class Restaurant
{
public Guid Id { get; set; }
public string Name { get; set; } = string.Empty;
public List<Waiter>? Waiters { get; set; }
}

View File

@ -1,13 +0,0 @@
using Microsoft.EntityFrameworkCore;
namespace ApiRestaurant.Models;
public class RestaurantContext : DbContext
{
public RestaurantContext(DbContextOptions<RestaurantContext> options)
: base(options)
{
}
public DbSet<Restaurant> Restaurants { get; set; } = null!;
}

View File

@ -1,7 +0,0 @@
namespace ApiRestaurant.Models;
public class Waiter
{
public Guid Id { get; set; }
public string Name { get; set; } = string.Empty;
}

View File

@ -1,28 +0,0 @@
namespace ApiRestaurant;
public static class NetworkSupport
{
public static async Task CheckConnectionAsync(string address)
{
using (var client = new HttpClient())
{
try
{
var response = await client.GetAsync(address);
if (response.IsSuccessStatusCode)
{
Console.WriteLine($"Соединение успешно проверено. Статус-код: {response.StatusCode}");
}
else
{
Console.WriteLine($"Соединение не удалось проверить. Статус-код: {response.StatusCode}");
}
}
catch (HttpRequestException ex)
{
Console.WriteLine($"Ошибка при проверке соединения: {ex.Message}");
}
}
}
}

View File

@ -1,36 +0,0 @@
using Microsoft.EntityFrameworkCore;
using ApiRestaurant.Models;
using ApiRestaurant;
var builder = WebApplication.CreateBuilder(args);
var waiterApiAddress =
Environment.GetEnvironmentVariable("WAITER_API_URL")
?? "http://localhost:4000/waiters/";
WaiterApiClient.Setup(waiterApiAddress);
await NetworkSupport.CheckConnectionAsync(waiterApiAddress);
// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddDbContext<RestaurantContext>(opt =>
opt.UseInMemoryDatabase("ApiRestaurant"));
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();

View File

@ -1,52 +0,0 @@
{
"profiles": {
"http": {
"commandName": "Project",
"launchBrowser": true,
"launchUrl": "swagger",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
},
"dotnetRunMessages": true,
"applicationUrl": "http://localhost:5089"
},
"https": {
"commandName": "Project",
"launchBrowser": true,
"launchUrl": "swagger",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
},
"dotnetRunMessages": true,
"applicationUrl": "https://localhost:7269;http://localhost:5089"
},
"IIS Express": {
"commandName": "IISExpress",
"launchBrowser": true,
"launchUrl": "swagger",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
}
},
"Container (Dockerfile)": {
"commandName": "Docker",
"launchBrowser": true,
"launchUrl": "{Scheme}://{ServiceHost}:{ServicePort}/swagger",
"environmentVariables": {
"ASPNETCORE_HTTPS_PORTS": "8081",
"ASPNETCORE_HTTP_PORTS": "8080"
},
"publishAllPorts": true,
"useSSL": true
}
},
"$schema": "http://json.schemastore.org/launchsettings.json",
"iisSettings": {
"windowsAuthentication": false,
"anonymousAuthentication": true,
"iisExpress": {
"applicationUrl": "http://localhost:57319",
"sslPort": 44369
}
}
}

View File

@ -1,26 +0,0 @@
using ApiRestaurant.Models;
namespace ApiRestaurant;
public static class WaiterApiClient
{
public static string WaiterApiAddress { get; private set; } = string.Empty;
private static readonly HttpClient client = new();
public static void Setup(string waiterApiAddress)
{
WaiterApiAddress = waiterApiAddress;
}
public static async Task<List<Waiter>> GetAllForRestaurant(Guid restaurantId) {
try
{
var waiters = await client.GetFromJsonAsync<List<Waiter>>(WaiterApiAddress + $"from-restaurant/{restaurantId}");
return waiters ?? [];
}
catch (HttpRequestException ex)
{
Console.WriteLine($"Error fetching waiters: {ex.Message}");
throw;
}
}
}

View File

@ -1,8 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
}
}

View File

@ -1,9 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*"
}

View File

@ -1 +0,0 @@
node_modules

View File

@ -1,9 +0,0 @@
FROM node:23-alpine
WORKDIR /app
COPY package*.json ./
RUN npm i
COPY . .
ENTRYPOINT ["npm", "run", "start"]

View File

@ -1,4 +0,0 @@
{
"waiters": {
}
}

View File

@ -1,82 +0,0 @@
import express from 'express';
import { JsonDB, Config } from 'node-json-db';
const app = express();
const db = new JsonDB(new Config("db", true, true, '/'));
const port = process.env.PORT || 4000;
app.use(express.json());
// CORS settings
app.use((req, res, next) => {
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
next();
});
// Get all
app.get('/waiters', async (req, res) => {
res.send(await db.getData('/waiters'));
});
// Create one
app.post('/waiters', async (req, res) => {
let newWaiter = req.body;
// Generate id
newWaiter.id = crypto.randomUUID();
await db.push("/waiters/" + newWaiter.id, newWaiter);
res.send(newWaiter);
});
// Get one
app.get('/waiters/:id', async (req, res) => {
const id = req.params.id;
// db.getData throws when the path does not exist, so check existence first (same pattern as PUT/DELETE)
const waiterExists = await db.exists('/waiters/' + id);
if (!waiterExists) {
res.status(404).json({ message: 'Waiter not found' });
return;
}
res.send(await db.getData('/waiters/' + id));
});
// Get all for one restaurant
app.get('/waiters/from-restaurant/:id', async (req, res) => {
const restaurantId = req.params.id;
const waiters = await db.filter('/waiters', w => w.restaurant_id === restaurantId);
res.send(waiters ?? []);
});
// Update one
app.put('/waiters/:id', async (req, res) => {
const id = req.params.id;
const updatedWaiter = req.body;
const waiterExists = await db.exists('/waiters/' + id);
if (!waiterExists) {
res.status(404).json({ message: 'Waiter not found' });
return;
}
updatedWaiter.id = id;
await db.push('/waiters/' + id, updatedWaiter, true);
res.send(updatedWaiter);
});
// Delete one
app.delete('/waiters/:id', async (req, res) => {
const id = req.params.id;
const waiterExists = await db.exists('/waiters/' + id);
if (!waiterExists) {
res.status(404).json({ message: 'Waiter not found' });
return;
}
await db.delete('/waiters/' + id);
res.sendStatus(200); // send the 200 response to the client
});
app.listen(port)

View File

@ -1,789 +0,0 @@
{
"name": "apiwaiter",
"version": "1.0.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "apiwaiter",
"version": "1.0.0",
"license": "ISC",
"dependencies": {
"express": "^4.21.1",
"node-json-db": "^2.3.0"
}
},
"node_modules/accepts": {
"version": "1.3.8",
"resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
"integrity": "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw==",
"license": "MIT",
"dependencies": {
"mime-types": "~2.1.34",
"negotiator": "0.6.3"
},
"engines": {
"node": ">= 0.6"
}
},
"node_modules/array-flatten": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg==",
"license": "MIT"
},
"node_modules/body-parser": {
"version": "1.20.3",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.3.tgz",
"integrity": "sha512-7rAxByjUMqQ3/bHJy7D6OGXvx/MMc4IqBn/X0fcM1QUcAItpZrBEYhWGem+tzXH90c+G01ypMcYJBO9Y30203g==",
"license": "MIT",
"dependencies": {
"bytes": "3.1.2",
"content-type": "~1.0.5",
"debug": "2.6.9",
"depd": "2.0.0",
"destroy": "1.2.0",
"http-errors": "2.0.0",
"iconv-lite": "0.4.24",
"on-finished": "2.4.1",
"qs": "6.13.0",
"raw-body": "2.5.2",
"type-is": "~1.6.18",
"unpipe": "1.0.0"
},
"engines": {
"node": ">= 0.8",
"npm": "1.2.8000 || >= 1.4.16"
}
},
"node_modules/bytes": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
"integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/call-bind": {
"version": "1.0.7",
"resolved": "https://registry.npmjs.org/call-bind/-/call-bind-1.0.7.tgz",
"integrity": "sha512-GHTSNSYICQ7scH7sZ+M2rFopRoLh8t2bLSW6BbgrtLsahOIB5iyAVJf9GjWK3cYTDaMj4XdBpM1cA6pIS0Kv2w==",
"license": "MIT",
"dependencies": {
"es-define-property": "^1.0.0",
"es-errors": "^1.3.0",
"function-bind": "^1.1.2",
"get-intrinsic": "^1.2.4",
"set-function-length": "^1.2.1"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/content-disposition": {
"version": "0.5.4",
"resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.4.tgz",
"integrity": "sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ==",
"license": "MIT",
"dependencies": {
"safe-buffer": "5.2.1"
},
"engines": {
"node": ">= 0.6"
}
},
"node_modules/content-type": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz",
"integrity": "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/cookie": {
"version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/cookie-signature": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.6.tgz",
"integrity": "sha512-QADzlaHc8icV8I7vbaJXJwod9HWYp8uCqf1xa4OfNu1T7JVxQIrUgOWtHdNDtPiywmFbiS12VjotIXLrKM3orQ==",
"license": "MIT"
},
"node_modules/debug": {
"version": "2.6.9",
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
"integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
"license": "MIT",
"dependencies": {
"ms": "2.0.0"
}
},
"node_modules/define-data-property": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/define-data-property/-/define-data-property-1.1.4.tgz",
"integrity": "sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A==",
"license": "MIT",
"dependencies": {
"es-define-property": "^1.0.0",
"es-errors": "^1.3.0",
"gopd": "^1.0.1"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/depd": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
"integrity": "sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/destroy": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/destroy/-/destroy-1.2.0.tgz",
"integrity": "sha512-2sJGJTaXIIaR1w4iJSNoN0hnMY7Gpc/n8D4qSCJw8QqFWXf7cuAgnEHxBpweaVcPevC2l3KpjYCx3NypQQgaJg==",
"license": "MIT",
"engines": {
"node": ">= 0.8",
"npm": "1.2.8000 || >= 1.4.16"
}
},
"node_modules/ee-first": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/ee-first/-/ee-first-1.1.1.tgz",
"integrity": "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow==",
"license": "MIT"
},
"node_modules/encodeurl": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
"integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/es-define-property": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.0.tgz",
"integrity": "sha512-jxayLKShrEqqzJ0eumQbVhTYQM27CfT1T35+gCgDFoL82JLsXqTJ76zv6A0YLOgEnLUMvLzsDsGIrl8NFpT2gQ==",
"license": "MIT",
"dependencies": {
"get-intrinsic": "^1.2.4"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/es-errors": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
"integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
}
},
"node_modules/escape-html": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
"integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==",
"license": "MIT"
},
"node_modules/etag": {
"version": "1.8.1",
"resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
"integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/express": {
"version": "4.21.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
"integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
"license": "MIT",
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
"body-parser": "1.20.3",
"content-disposition": "0.5.4",
"content-type": "~1.0.4",
"cookie": "0.7.1",
"cookie-signature": "1.0.6",
"debug": "2.6.9",
"depd": "2.0.0",
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"etag": "~1.8.1",
"finalhandler": "1.3.1",
"fresh": "0.5.2",
"http-errors": "2.0.0",
"merge-descriptors": "1.0.3",
"methods": "~1.1.2",
"on-finished": "2.4.1",
"parseurl": "~1.3.3",
"path-to-regexp": "0.1.10",
"proxy-addr": "~2.0.7",
"qs": "6.13.0",
"range-parser": "~1.2.1",
"safe-buffer": "5.2.1",
"send": "0.19.0",
"serve-static": "1.16.2",
"setprototypeof": "1.2.0",
"statuses": "2.0.1",
"type-is": "~1.6.18",
"utils-merge": "1.0.1",
"vary": "~1.1.2"
},
"engines": {
"node": ">= 0.10.0"
}
},
"node_modules/finalhandler": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.3.1.tgz",
"integrity": "sha512-6BN9trH7bp3qvnrRyzsBz+g3lZxTNZTbVO2EV1CS0WIcDbawYVdYvGflME/9QP0h0pYlCDBCTjYa9nZzMDpyxQ==",
"license": "MIT",
"dependencies": {
"debug": "2.6.9",
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"on-finished": "2.4.1",
"parseurl": "~1.3.3",
"statuses": "2.0.1",
"unpipe": "~1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
"integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/fresh": {
"version": "0.5.2",
"resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
"integrity": "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/function-bind": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
"integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/get-intrinsic": {
"version": "1.2.4",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
"integrity": "sha512-5uYhsJH8VJBTv7oslg4BznJYhDoRI6waYCxMmCdnTrcCrHA/fCFKoTFz2JKKE0HdDFUF7/oQuhzumXJK7paBRQ==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0",
"function-bind": "^1.1.2",
"has-proto": "^1.0.1",
"has-symbols": "^1.0.3",
"hasown": "^2.0.0"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/gopd": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz",
"integrity": "sha512-d65bNlIadxvpb/A2abVdlqKqV563juRnZ1Wtk6s1sIR8uNsXR70xqIzVqxVf1eTqDunwT2MkczEeaezCKTZhwA==",
"license": "MIT",
"dependencies": {
"get-intrinsic": "^1.1.3"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-property-descriptors": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/has-property-descriptors/-/has-property-descriptors-1.0.2.tgz",
"integrity": "sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg==",
"license": "MIT",
"dependencies": {
"es-define-property": "^1.0.0"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-proto": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.0.3.tgz",
"integrity": "sha512-SJ1amZAJUiZS+PhsVLf5tGydlaVB8EdFpaSO4gmiUKUOxk8qzn5AIy4ZeJUmh22znIdk/uMAUT2pl3FxzVUH+Q==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-symbols": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.3.tgz",
"integrity": "sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/hasown": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
"integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
"license": "MIT",
"dependencies": {
"function-bind": "^1.1.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/http-errors": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.0.tgz",
"integrity": "sha512-FtwrG/euBzaEjYeRqOgly7G0qviiXoJWnvEH2Z1plBdXgbyjv34pHTSb9zoeHMyDy33+DWy5Wt9Wo+TURtOYSQ==",
"license": "MIT",
"dependencies": {
"depd": "2.0.0",
"inherits": "2.0.4",
"setprototypeof": "1.2.0",
"statuses": "2.0.1",
"toidentifier": "1.0.1"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/iconv-lite": {
"version": "0.4.24",
"resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz",
"integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
"license": "MIT",
"dependencies": {
"safer-buffer": ">= 2.1.2 < 3"
},
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/inherits": {
"version": "2.0.4",
"resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
"license": "ISC"
},
"node_modules/ipaddr.js": {
"version": "1.9.1",
"resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz",
"integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==",
"license": "MIT",
"engines": {
"node": ">= 0.10"
}
},
"node_modules/media-typer": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
"integrity": "sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/merge-descriptors": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-1.0.3.tgz",
"integrity": "sha512-gaNvAS7TZ897/rVaZ0nMtAyxNyi/pdbjbAwUpFQpN70GqnVfOiXpeUUMKRBmzXaSQ8DdTX4/0ms62r2K+hE6mQ==",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/methods": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz",
"integrity": "sha512-iclAHeNqNm68zFtnZ0e+1L2yUIdvzNoauKU4WBA3VvH/vPFieF7qfRlwUZU+DA9P9bPXIS90ulxoUoCH23sV2w==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/mime": {
"version": "1.6.0",
"resolved": "https://registry.npmjs.org/mime/-/mime-1.6.0.tgz",
"integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==",
"license": "MIT",
"bin": {
"mime": "cli.js"
},
"engines": {
"node": ">=4"
}
},
"node_modules/mime-db": {
"version": "1.52.0",
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/mime-types": {
"version": "2.1.35",
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
"license": "MIT",
"dependencies": {
"mime-db": "1.52.0"
},
"engines": {
"node": ">= 0.6"
}
},
"node_modules/ms": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
"license": "MIT"
},
"node_modules/negotiator": {
"version": "0.6.3",
"resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.3.tgz",
"integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/node-json-db": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/node-json-db/-/node-json-db-2.3.0.tgz",
"integrity": "sha512-B8T+w4q6zXZ20YcfQINLSjMGgImRKzkvR0ShYYoNRdLxtMhVvbzaMBzNdEaRcCjilW/lKS+g9CwVXNoK5uTncw==",
"license": "MIT",
"dependencies": {
"rwlock": "^5.0.0"
}
},
"node_modules/object-inspect": {
"version": "1.13.2",
"resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.2.tgz",
"integrity": "sha512-IRZSRuzJiynemAXPYtPe5BoI/RESNYR7TYm50MC5Mqbd3Jmw5y790sErYw3V6SryFJD64b74qQQs9wn5Bg/k3g==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/on-finished": {
"version": "2.4.1",
"resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.4.1.tgz",
"integrity": "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==",
"license": "MIT",
"dependencies": {
"ee-first": "1.1.1"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/parseurl": {
"version": "1.3.3",
"resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz",
"integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/path-to-regexp": {
"version": "0.1.10",
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.10.tgz",
"integrity": "sha512-7lf7qcQidTku0Gu3YDPc8DJ1q7OOucfa/BSsIwjuh56VU7katFvuM8hULfkwB3Fns/rsVF7PwPKVw1sl5KQS9w==",
"license": "MIT"
},
"node_modules/proxy-addr": {
"version": "2.0.7",
"resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
"integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
"license": "MIT",
"dependencies": {
"forwarded": "0.2.0",
"ipaddr.js": "1.9.1"
},
"engines": {
"node": ">= 0.10"
}
},
"node_modules/qs": {
"version": "6.13.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.13.0.tgz",
"integrity": "sha512-+38qI9SOr8tfZ4QmJNplMUxqjbe7LKvvZgWdExBOmd+egZTtjLB67Gu0HRX3u/XOq7UU2Nx6nsjvS16Z9uwfpg==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.0.6"
},
"engines": {
"node": ">=0.6"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/range-parser": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
"integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/raw-body": {
"version": "2.5.2",
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.2.tgz",
"integrity": "sha512-8zGqypfENjCIqGhgXToC8aB2r7YrBX+AQAfIPs/Mlk+BtPTztOvTS01NRW/3Eh60J+a48lt8qsCzirQ6loCVfA==",
"license": "MIT",
"dependencies": {
"bytes": "3.1.2",
"http-errors": "2.0.0",
"iconv-lite": "0.4.24",
"unpipe": "1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/rwlock": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/rwlock/-/rwlock-5.0.0.tgz",
"integrity": "sha512-XgzRqLMfCcm9QfZuPav9cV3Xin5TRcIlp4X/SH3CvB+x5D2AakdlEepfJKDd8ByncvfpcxNWdRZVUl38PS6ZJg==",
"license": "MIT"
},
"node_modules/safe-buffer": {
"version": "5.2.1",
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
"integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT"
},
"node_modules/safer-buffer": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
"license": "MIT"
},
"node_modules/send": {
"version": "0.19.0",
"resolved": "https://registry.npmjs.org/send/-/send-0.19.0.tgz",
"integrity": "sha512-dW41u5VfLXu8SJh5bwRmyYUbAoSB3c9uQh6L8h/KtsFREPWpbX1lrljJo186Jc4nmci/sGUZ9a0a0J2zgfq2hw==",
"license": "MIT",
"dependencies": {
"debug": "2.6.9",
"depd": "2.0.0",
"destroy": "1.2.0",
"encodeurl": "~1.0.2",
"escape-html": "~1.0.3",
"etag": "~1.8.1",
"fresh": "0.5.2",
"http-errors": "2.0.0",
"mime": "1.6.0",
"ms": "2.1.3",
"on-finished": "2.4.1",
"range-parser": "~1.2.1",
"statuses": "2.0.1"
},
"engines": {
"node": ">= 0.8.0"
}
},
"node_modules/send/node_modules/encodeurl": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-1.0.2.tgz",
"integrity": "sha512-TPJXq8JqFaVYm2CWmPvnP2Iyo4ZSM7/QKcSmuMLDObfpH5fi7RUGmd/rTDf+rut/saiDiQEeVTNgAmJEdAOx0w==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/send/node_modules/ms": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
"license": "MIT"
},
"node_modules/serve-static": {
"version": "1.16.2",
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.16.2.tgz",
"integrity": "sha512-VqpjJZKadQB/PEbEwvFdO43Ax5dFBZ2UECszz8bQ7pi7wt//PWe1P6MN7eCnjsatYtBT6EuiClbjSWP2WrIoTw==",
"license": "MIT",
"dependencies": {
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"parseurl": "~1.3.3",
"send": "0.19.0"
},
"engines": {
"node": ">= 0.8.0"
}
},
"node_modules/set-function-length": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
"integrity": "sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg==",
"license": "MIT",
"dependencies": {
"define-data-property": "^1.1.4",
"es-errors": "^1.3.0",
"function-bind": "^1.1.2",
"get-intrinsic": "^1.2.4",
"gopd": "^1.0.1",
"has-property-descriptors": "^1.0.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/setprototypeof": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
"integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==",
"license": "ISC"
},
"node_modules/side-channel": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.0.6.tgz",
"integrity": "sha512-fDW/EZ6Q9RiO8eFG8Hj+7u/oW+XrPTIChwCOM2+th2A6OblDtYYIpve9m+KvI9Z4C9qSEXlaGR6bTEYHReuglA==",
"license": "MIT",
"dependencies": {
"call-bind": "^1.0.7",
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.4",
"object-inspect": "^1.13.1"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/statuses": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.1.tgz",
"integrity": "sha512-RwNA9Z/7PrK06rYLIzFMlaF+l73iwpzsqRIFgbMLbTcLD6cOao82TaWefPXQvB2fOC4AjuYSEndS7N/mTCbkdQ==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/toidentifier": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
"integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==",
"license": "MIT",
"engines": {
"node": ">=0.6"
}
},
"node_modules/type-is": {
"version": "1.6.18",
"resolved": "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz",
"integrity": "sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==",
"license": "MIT",
"dependencies": {
"media-typer": "0.3.0",
"mime-types": "~2.1.24"
},
"engines": {
"node": ">= 0.6"
}
},
"node_modules/unpipe": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz",
"integrity": "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/utils-merge": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/utils-merge/-/utils-merge-1.0.1.tgz",
"integrity": "sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA==",
"license": "MIT",
"engines": {
"node": ">= 0.4.0"
}
},
"node_modules/vary": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz",
"integrity": "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
}
}
}

View File

@ -1,17 +0,0 @@
{
"name": "apiwaiter",
"version": "1.0.0",
"description": "api for waiters",
"type": "module",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"start": "node index.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"express": "^4.21.1",
"node-json-db": "^2.3.0"
}
}

View File

@ -1,60 +0,0 @@
# Laboratory work 3
> Hello, my name is Балахонов Данила, group ПИбд-42
>
> *— Балахонов Данила, ПИбд-42*
A video of lab 3 is available at this [link](https://drive.google.com/file/d/1KplDSt-BUQl9OUTM9x7VZcGeokuLhdOD/view?usp=sharing).
## How to run lab 3?
### Components required to run lab 3
> This section describes how to install the components needed to run lab 3 on the GNU/Linux distribution **Ubuntu**.
Running lab 3 requires the following components:
- Git
- Docker
- Docker compose
To install **Git**, enter the following commands on the command line:
``` bash
sudo apt-get update
sudo apt-get install git
```
To install **Docker** and **Docker compose**, run these commands:
``` bash
# Set up the Docker apt repository
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
$(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
# Install Docker and its components
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
```
### Running lab 3
To run lab 3, **clone** the repository into any folder and **switch to the branch** balakhonov_danila_lab_3.
Then, in the folder containing `docker-compose.yaml`, run the following command:
``` bash
sudo docker-compose up --build
```
This builds the containers and starts them.
The system's functionality is accessed through nginx on port 80; a quick check is shown below.
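Once the stack is up, the gateway can be probed from the host. This is a minimal sketch, assuming the route prefixes declared in the `nginx.conf` included later in this diff; the exact resource paths exposed by the services may differ.
``` bash
# nginx is the only published entry point (port 80); both back-end services
# are reachable solely through it.
curl -i http://localhost/waiters/        # proxied to the Node.js waiter service
curl -i http://localhost/restaurants/    # proxied to the ASP.NET Core restaurant service
```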
## Which technologies were used?
Lab 3 uses the following technologies:
- Dockerfile
- Docker compose
- Git
- .NET SDK and C# in particular
- ASP.NET Core
- Node.js
- Express
## What does lab 3 do?
The point of lab 3 is to distribute functionality across several services and to implement the interaction between them: as the compose file below shows, the restaurant service is configured with `WAITER_API_URL` pointing at the waiter service, while nginx exposes both to the outside. A sketch of how to observe this service-to-service link follows.
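A minimal way to see that link, assuming the compose project creates its default network named after the project directory (the directory name below is an assumption; adjust it to your checkout):
``` bash
# The waiter API only "expose"s port 4000, so it is reachable solely on the
# compose network; attach a one-off curl container to that network to call it
# by its service name, just as restaurant_service does via WAITER_API_URL.
sudo docker run --rm --network balakhonov_danila_lab_3_default \
  curlimages/curl:latest -s http://waiter_service:4000/waiters/
```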

View File

@ -1,26 +0,0 @@
services:
  restaurant_service:
    build: ./ApiRestaurant/
    environment:
      WAITER_API_URL: http://waiter_service:4000/waiters/
    expose:
      - 8080
    depends_on:
      - waiter_service
  waiter_service:
    build: ./ApiWaiter/
    environment:
      PORT: 4000
    expose:
      - 4000
  nginx:
    image: nginx
    depends_on:
      - restaurant_service
      - waiter_service
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf

View File

@ -1,27 +0,0 @@
events {
    worker_connections 1024;
}
http {
    server {
        listen 80;
        listen [::]:80;
        server_name localhost;
        location /waiters/ {
            proxy_pass http://waiter_service:4000/waiters/;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Prefix /test;
        }
        location /restaurants/ {
            proxy_pass http://restaurant_service:8080/api/restaurant/;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Prefix /admin;
        }
    }
}
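For illustration, the two `location` blocks above rewrite the public prefixes to different upstream paths. Assuming the stack from the compose file above is running and that a resource id `1` exists (the id is hypothetical), the mapping works roughly like this:
``` bash
curl -s http://localhost/waiters/1       # upstream sees http://waiter_service:4000/waiters/1
curl -s http://localhost/restaurants/1   # upstream sees http://restaurant_service:8080/api/restaurant/1
```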

View File

@ -1,6 +1,11 @@
-# Distributed computing and applications, Lab 3
+# Distributed computing and applications, Lab 2
 ## _Author: Базунов Андрей Игревич, ПИбд-42_
+Services (_listed in their order of execution_):
+- 1.FileCreator - (_generates the test data_)
+- 2.FirstService - (_implements variant 1.4 of the assignment_)
+- 3.SecondService - (_implements variant 2.2 of the assignment_)
 GoLang was chosen as the main language. A DOCKERFILE was created for each service, describing the conditions and steps for building each module.
 # Docker
@ -22,4 +27,4 @@ docker-compose up -d --build
 docker-compose down
 ```
-[Demo of the system](https://vk.com/video/@viltskaa?z=video236673313_456239577%2Fpl_236673313_-2)
+[Demo of the system](https://vk.com/video236673313_456239575)

Some files were not shown because too many files have changed in this diff.