Mirror of https://github.com/TandoorRecipes/recipes.git, synced 2025-12-25 11:19:39 -05:00
Compare commits
139 Commits
| SHA1 |
|---|
| 8572f338ad |
| 920ec8e74b |
| 2328bf2342 |
| 85620a1431 |
| 0037858885 |
| 9df3ff0028 |
| 0a43272126 |
| ff96eb194f |
| 6b69c4184b |
| e90e21181c |
| 5237228673 |
| ecb3172085 |
| b4f4e9fd2b |
| 6d0f3b99c8 |
| cdb94ae628 |
| 0d589444fd |
| 95fa420c3a |
| dd4dc1083f |
| 04f889b742 |
| 67d374c071 |
| 8d749e351d |
| 417ffcab5d |
| 0c0012aab8 |
| e562883da3 |
| a81bc335cc |
| ebee1ccd4b |
| b1104b4581 |
| f5e952d88c |
| 968fcc3936 |
| 73d3d87217 |
| 9050f648f9 |
| a4a9e104b5 |
| 3ed85ea0c4 |
| da8ceb7abe |
| 5ff3a6bb2e |
| 3ed750b330 |
| 0315911802 |
| 1fe96f2b3d |
| 11761c0b15 |
| a7b0a1ab30 |
| e4fcae3b00 |
| b0401639f1 |
| 5f8770f502 |
| aaa627d3b6 |
| e77734f696 |
| 73df6bb961 |
| 7d187b638e |
| 68bb750f8c |
| 1e35035540 |
| 561ba2f1da |
| bd600301f9 |
| cf5483a4d9 |
| ba417c49dd |
| 0259b1dc08 |
| 34553dadd7 |
| 535b88c8db |
| 71eb8818b5 |
| 0dc94a817f |
| 4df862c7f3 |
| 69f013c980 |
| 23c420dda8 |
| 63daf1e958 |
| 52b44eacdd |
| fa7cc12b99 |
| 64d2108ef6 |
| dccfdcc11c |
| 974e72631d |
| 70f31b8553 |
| 3cb980c0e7 |
| b8a403b7c1 |
| b037d90220 |
| ad32e457fa |
| 8e2726caeb |
| e693737c57 |
| 9f239c06d3 |
| 0f551c5f88 |
| eb224a769d |
| 4515eba9d7 |
| 30b37bf0b6 |
| f17207e56e |
| 2cba0e18af |
| ec6e81316a |
| b72897b222 |
| bca1ebbf99 |
| f0342d4568 |
| 81f62de500 |
| f783949a61 |
| 820fad1b5c |
| 1169abd942 |
| 48e175f58f |
| 5450e18342 |
| ea590f8e49 |
| 13626ca11b |
| f53fe1e3c4 |
| d177316b47 |
| 338db1fac2 |
| 377619473c |
| 000962c5bb |
| 9228c1d59f |
| 27007de7a0 |
| 29c99b66a1 |
| bc179f430d |
| 58c412ad95 |
| 4f248afe76 |
| f722d24eaa |
| 723b74509f |
| ad4b1393dd |
| 04bab7072c |
| 6391cee9eb |
| 14884fc0d4 |
| f2191f79dd |
| c2533d9ea2 |
| db72fdb1bb |
| 78252662cb |
| 4e078bf477 |
| 2e9e226fe0 |
| 18cfbd80ab |
| 4d284b4fff |
| b1128dd134 |
| 3aebf58406 |
| f3816a77df |
| e4183d79ab |
| f4aa1a083f |
| ed5508b576 |
| 040e247487 |
| 5d28c7b17d |
| 15b2df07f2 |
| 7be7c5b954 |
| 0853a9ec64 |
| fa3daee965 |
| 774c05e76f |
| b08c39e284 |
| ae036cfa9a |
| 37628c1735 |
| 530a6db35c |
| 2930093da0 |
| b7e63a466b |
| a35c92439c |
| eed09a7891 |
.github/workflows/build-docker.yml (2 changes, vendored)
```diff
@@ -21,7 +21,7 @@ jobs:
           suffix: ""
           continue-on-error: false
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5

       - name: Get version number
         id: get_version
```
.github/workflows/ci.yml (4 changes, vendored)
```diff
@@ -12,8 +12,8 @@ jobs:
         python-version: ["3.12"]
         node-version: ["22"]
     steps:
-      - uses: actions/checkout@v4
-      - uses: awalsh128/cache-apt-pkgs-action@v1.5.1
+      - uses: actions/checkout@v5
+      - uses: awalsh128/cache-apt-pkgs-action@v1.5.3
        with:
          packages: libsasl2-dev python3-dev libxml2-dev libxmlsec1-dev libxslt-dev libxmlsec1-openssl libxslt-dev libldap2-dev libssl-dev gcc musl-dev postgresql-dev zlib-dev jpeg-dev libwebp-dev openssl-dev libffi-dev cargo openldap-dev python3-dev xmlsec-dev xmlsec build-base g++ curl
          version: 1.0
```
.github/workflows/codeql-analysis.yml (2 changes, vendored)
```diff
@@ -12,7 +12,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
        with:
          # We must fetch at least the immediate parents so that if this is
          # a pull request then we can checkout the head.
```
.github/workflows/docs.yml (2 changes, vendored)
```diff
@@ -12,7 +12,7 @@ jobs:
    if: github.repository_owner == 'TandoorRecipes' && ${{ github.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
      - uses: actions/setup-python@v5
        with:
          python-version: 3.x
```
README.md

```diff
@@ -30,9 +30,11 @@
 

 
 ## Core Features
 
 - 🥗 **Manage your recipes** - Manage your ever growing recipe collection
 - 📆 **Plan** - multiple meals for each day
 - 🛒 **Shopping lists** - via the meal plan or straight from recipes
+- 🪄 **use AI** to recognize images, sort recipe steps, find nutrition facts and more
 - 📚 **Cookbooks** - collect recipes into books
 - 👪 **Share and collaborate** on recipes with friends and family
 
@@ -62,12 +64,13 @@ a public page.
 
 Documentation can be found [here](https://docs.tandoor.dev/).
 
-## Support our work
+## ❤️ Support our work ❤️
 Tandoor is developed by volunteers in their free time just because its fun. That said earning
 some money with the project allows us to spend more time on it and thus make improvements we otherwise couldn't.
 Because of that there are several ways you can support us
 
 - **GitHub Sponsors** You can sponsor contributors of this project on GitHub: [vabene1111](https://github.com/sponsors/vabene1111)
+- **Patron** You can sponsor contributors of this project on Patron: [vabene111](https://www.patreon.com/cw/vabene1111)
 - **Host at Hetzner** We have been very happy customers of Hetzner for multiple years for all of our projects. If you want to get into self-hosting or are tired of the expensive big providers, their cloud servers are a great place to get started. When you sign up via our [referral link](https://hetzner.cloud/?ref=ISdlrLmr9kGj) you will get 20€ worth of cloud credits and we get a small kickback too.
 - **Let us host for you** We are offering a [hosted version](https://app.tandoor.dev) where all profits support us and the development of tandoor (currently only available in germany).
```
boot.sh (17 changes)
```diff
@@ -22,6 +22,14 @@ display_warning() {
     echo -e "$1"
 }
 
+# prepare nginx config
+envsubst '$MEDIA_ROOT $STATIC_ROOT $TANDOOR_PORT' < /opt/recipes/http.d/Recipes.conf.template > /opt/recipes/http.d/Recipes.conf
+
+# start nginx early to display error pages
+echo "Starting nginx"
+nginx
+
+
 echo "Checking configuration..."
 
 # SECRET_KEY (or a valid file at SECRET_KEY_FILE) must be set in .env file
@@ -93,7 +101,7 @@ fi
 
 echo "Collecting static files, this may take a while..."
 
-python manage.py collectstatic --noinput
+python manage.py collectstatic --noinput --clear
 
 echo "Done"
 
@@ -101,13 +109,6 @@ chmod -R 755 ${MEDIA_ROOT:-/opt/recipes/mediafiles}
 
 ipv6_disable=$(cat /sys/module/ipv6/parameters/disable)
 
-# prepare nginx config
-envsubst '$MEDIA_ROOT $STATIC_ROOT $TANDOOR_PORT' < /opt/recipes/http.d/Recipes.conf.template > /opt/recipes/http.d/Recipes.conf
-
-# start nginx
-echo "Starting nginx"
-nginx
-
 echo "Starting gunicorn"
 # Check if IPv6 is enabled, only then run gunicorn with ipv6 support
 if [ "$ipv6_disable" -eq 0 ]; then
```
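
The relocated `envsubst` call substitutes only the three whitelisted variables, leaving every other `$name` in the nginx template untouched (nginx configs contain `$variables` of their own). A minimal Python sketch of that whitelisting behaviour; the template line and port value are invented for illustration:

```python
import os
import re

def envsubst(text, allowed):
    # replace $VAR only when VAR is whitelisted and set, mirroring
    # `envsubst '$MEDIA_ROOT $STATIC_ROOT $TANDOOR_PORT'`
    return re.sub(
        r'\$(\w+)',
        lambda m: os.environ.get(m.group(1), m.group(0)) if m.group(1) in allowed else m.group(0),
        text,
    )

os.environ['TANDOOR_PORT'] = '8080'  # invented value for the demo
template = 'listen $TANDOOR_PORT;  # nginx vars like $remote_addr stay intact'
print(envsubst(template, {'MEDIA_ROOT', 'STATIC_ROOT', 'TANDOOR_PORT'}))
# -> listen 8080;  # nginx vars like $remote_addr stay intact
```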
cookbook/admin.py

```diff
@@ -91,8 +91,8 @@ admin.site.register(SearchPreference, SearchPreferenceAdmin)
 
 
 class AiProviderAdmin(admin.ModelAdmin):
-    list_display = ('name', 'space', 'model',)
-    search_fields = ('name', 'space', 'model',)
+    list_display = ('name', 'space', 'model_name',)
+    search_fields = ('name', 'space', 'model_name',)
 
 
 admin.site.register(AiProvider, AiProviderAdmin)
```
cookbook/forms.py

```diff
@@ -26,6 +26,7 @@ class ImportExportBase(forms.Form):
     PAPRIKA = 'PAPRIKA'
     NEXTCLOUD = 'NEXTCLOUD'
     MEALIE = 'MEALIE'
+    MEALIE1 = 'MEALIE1'
     CHOWDOWN = 'CHOWDOWN'
     SAFFRON = 'SAFFRON'
     CHEFTAP = 'CHEFTAP'
@@ -46,7 +47,7 @@ class ImportExportBase(forms.Form):
     PDF = 'PDF'
     GOURMET = 'GOURMET'
 
-    type = forms.ChoiceField(choices=((DEFAULT, _('Default')), (PAPRIKA, 'Paprika'), (NEXTCLOUD, 'Nextcloud Cookbook'), (MEALIE, 'Mealie'), (CHOWDOWN, 'Chowdown'),
+    type = forms.ChoiceField(choices=((DEFAULT, _('Default')), (PAPRIKA, 'Paprika'), (NEXTCLOUD, 'Nextcloud Cookbook'), (MEALIE, 'Mealie'), (MEALIE1, 'Mealie1'), (CHOWDOWN, 'Chowdown'),
                                       (SAFFRON, 'Saffron'), (CHEFTAP, 'ChefTap'), (PEPPERPLATE, 'Pepperplate'), (RECETTETEK, 'RecetteTek'), (RECIPESAGE, 'Recipe Sage'),
                                       (DOMESTICA, 'Domestica'), (MEALMASTER, 'MealMaster'), (REZKONV, 'RezKonv'), (OPENEATS, 'Openeats'), (RECIPEKEEPER, 'Recipe Keeper'),
                                       (PLANTOEAT, 'Plantoeat'), (COOKBOOKAPP, 'CookBookApp'), (COPYMETHAT, 'CopyMeThat'), (PDF, 'PDF'), (MELARECIPES, 'Melarecipes'),
@@ -75,6 +76,11 @@ class ImportForm(ImportExportBase):
     files = MultipleFileField(required=True)
     duplicates = forms.BooleanField(help_text=_('To prevent duplicates recipes with the same name as existing ones are ignored. Check this box to import everything.'),
                                     required=False)
+    meal_plans = forms.BooleanField(required=False)
+    shopping_lists = forms.BooleanField(required=False)
+    nutrition_per_serving = forms.BooleanField(required=False)  # some managers (e.g. mealie) do not specify what the nutrition's relate to so we let the user choose
 
 
 class ExportForm(ImportExportBase):
     recipes = forms.ModelMultipleChoiceField(widget=MultiSelectWidget, queryset=Recipe.objects.none(), required=False)
     all = forms.BooleanField(required=False)
```
cookbook/helper/ai_helper.py

```diff
@@ -5,6 +5,7 @@ from django.db.models import Sum
 from litellm import CustomLogger
 
 from cookbook.models import AiLog
+from recipes import settings
 
 
 def get_monthly_token_usage(space):
@@ -32,12 +33,14 @@ class AiCallbackHandler(CustomLogger):
     space = None
     user = None
     ai_provider = None
+    function = None
 
-    def __init__(self, space, user, ai_provider):
+    def __init__(self, space, user, ai_provider, function):
         super().__init__()
         self.space = space
         self.user = user
         self.ai_provider = ai_provider
+        self.function = function
 
     def log_pre_api_call(self, model, messages, kwargs):
         pass
@@ -61,6 +64,8 @@ class AiCallbackHandler(CustomLogger):
             remaining_balance = self.space.ai_credits_balance - Decimal(str(credit_cost))
             if remaining_balance < 0:
                 remaining_balance = 0
+            if settings.HOSTED and self.space.ai_credits_monthly == 0:
+                self.space.ai_enabled = False
 
             self.space.ai_credits_balance = remaining_balance
             credits_from_balance = True
@@ -74,7 +79,7 @@ class AiCallbackHandler(CustomLogger):
             end_time=end_time,
             input_tokens=response_obj['usage']['prompt_tokens'],
             output_tokens=response_obj['usage']['completion_tokens'],
-            function=AiLog.F_FILE_IMPORT,
+            function=self.function,
             credit_cost=credit_cost,
             credits_from_balance=credits_from_balance,
         )
```
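
With the new `function` parameter the callback handler records which feature triggered the AI call instead of hard-coding `AiLog.F_FILE_IMPORT`. A stripped-down sketch of the plumbing; the real class extends litellm's `CustomLogger` and writes an `AiLog` row on success, all of which is omitted here:

```python
class AiCallbackHandler:
    # simplified stand-in: only the constructor wiring from the diff
    def __init__(self, space, user, ai_provider, function):
        self.space = space
        self.user = user
        self.ai_provider = ai_provider
        # e.g. 'FILE_IMPORT', 'STEP_SORT' or 'FOOD_PROPERTIES' (the AiLog constants)
        self.function = function

handler = AiCallbackHandler(None, None, None, 'STEP_SORT')
print(handler.function)  # STEP_SORT
```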
cookbook/helper/open_data_importer.py

```diff
@@ -51,10 +51,10 @@ class OpenDataImporter:
         for field in field_list:
             if isinstance(getattr(obj, field), float) or isinstance(getattr(obj, field), Decimal):
                 if abs(float(getattr(obj, field)) - float(existing_obj[field])) > 0.001:  # convert both to float and check if basically equal
-                    print(f'comparing FLOAT {obj} failed because field {field} is not equal ({getattr(obj, field)} != {existing_obj[field]})')
+                    #print(f'comparing FLOAT {obj} failed because field {field} is not equal ({getattr(obj, field)} != {existing_obj[field]})')
                     return False
             elif getattr(obj, field) != existing_obj[field]:
-                print(f'comparing {obj} failed because field {field} is not equal ({getattr(obj, field)} != {existing_obj[field]})')
+                #print(f'comparing {obj} failed because field {field} is not equal ({getattr(obj, field)} != {existing_obj[field]})')
                 return False
         return True
@@ -342,7 +342,7 @@ class OpenDataImporter:
             'name': self.data[datatype][k]['name'],
             'plural_name': self.data[datatype][k]['plural_name'] if self.data[datatype][k]['plural_name'] != '' else None,
             'supermarket_category_id': self.slug_id_cache['category'][self.data[datatype][k]['store_category']] if self.data[datatype][k]['store_category'] in self.slug_id_cache['category'] else None,
-            'fdc_id': re.sub(r'\D', '', self.data[datatype][k]['fdc_id']) if self.data[datatype][k]['fdc_id'] != '' else None,
+            'fdc_id': re.sub(r'\D', '', str(self.data[datatype][k]['fdc_id'])) if self.data[datatype][k]['fdc_id'] != '' else None,
             'open_data_slug': k,
             'properties_food_unit_id': None,
             'space_id': self.request.space.id,
```
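
The added `str()` cast matters because `re.sub` accepts only strings or bytes: when the open-data JSON delivers `fdc_id` as a number, the old expression raised a `TypeError`. A quick demonstration with an invented id:

```python
import re

fdc_id = 173420  # fdc_id arriving as an int rather than a string
print(re.sub(r'\D', '', str(fdc_id)))  # '173420'

try:
    re.sub(r'\D', '', fdc_id)  # what the old code effectively did
except TypeError as e:
    print(e)  # expected string or bytes-like object, got 'int'
```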
cookbook/helper/permission_helper.py

```diff
@@ -3,17 +3,19 @@ import inspect
 from django.conf import settings
 from django.contrib import messages
 from django.contrib.auth.decorators import user_passes_test
 from django.contrib.auth.models import Group
 from django.core.cache import cache
 from django.core.exceptions import ObjectDoesNotExist, ValidationError
 from django.http import HttpResponseRedirect
 from django.urls import reverse, reverse_lazy
 from django.utils.translation import gettext as _
 from django_scopes import scopes_disabled
 from oauth2_provider.contrib.rest_framework import TokenHasReadWriteScope, TokenHasScope
 from oauth2_provider.models import AccessToken
 from rest_framework import permissions
 from rest_framework.permissions import SAFE_METHODS
 
-from cookbook.models import Recipe, ShareLink, UserSpace
+import random
+
+from cookbook.models import Recipe, ShareLink, UserSpace, Space
 
 
 def get_allowed_groups(groups_required):
@@ -330,6 +332,7 @@ class CustomRecipePermission(permissions.BasePermission):
         return ((has_group_permission(request.user, ['guest']) and request.method in SAFE_METHODS)
                 or has_group_permission(request.user, ['user'])) and obj.space == request.space
 
 
 class CustomAiProviderPermission(permissions.BasePermission):
     """
     Custom permission class for the AiProvider api endpoint
@@ -455,3 +458,36 @@ class IsReadOnlyDRF(permissions.BasePermission):
 
     def has_permission(self, request, view):
         return request.method in SAFE_METHODS
+
+
+class IsCreateDRF(permissions.BasePermission):
+    message = 'You cannot interact with this object, you can only create'
+
+    def has_permission(self, request, view):
+        return request.method == 'POST'
+
+
+def create_space_for_user(user, name=None):
+    with scopes_disabled():
+        if not name:
+            name = f"{user.username}'s Space"
+
+        if Space.objects.filter(name=name).exists():
+            name = f'{name} #{random.randrange(1, 10 ** 5)}'
+
+        created_space = Space(name=name,
+                              created_by=user,
+                              max_file_storage_mb=settings.SPACE_DEFAULT_MAX_FILES,
+                              max_recipes=settings.SPACE_DEFAULT_MAX_RECIPES,
+                              max_users=settings.SPACE_DEFAULT_MAX_USERS,
+                              allow_sharing=settings.SPACE_DEFAULT_ALLOW_SHARING,
+                              ai_enabled=settings.SPACE_AI_ENABLED,
+                              ai_credits_monthly=settings.SPACE_AI_CREDITS_MONTHLY,
+                              space_setup_completed=False, )
+        created_space.save()
+
+        UserSpace.objects.filter(user=user).update(active=False)
+        user_space = UserSpace.objects.create(space=created_space, user=user, active=True)
+        user_space.groups.add(Group.objects.filter(name='admin').get())
+
+        return user_space
```
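
`create_space_for_user` appends a random numeric suffix when the default space name is already taken. A self-contained sketch of just that naming rule; the helper and the `taken` set are illustrative, not Tandoor API:

```python
import random

def unique_space_name(username, existing):
    # mirrors the collision handling in create_space_for_user above
    name = f"{username}'s Space"
    if name in existing:
        name = f'{name} #{random.randrange(1, 10 ** 5)}'
    return name

taken = {"alice's Space"}
print(unique_space_name('alice', taken))  # e.g. "alice's Space #48213"
print(unique_space_name('bob', taken))    # "bob's Space"
```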
cookbook/helper/recipe_search.py

```diff
@@ -288,7 +288,7 @@ class RecipeSearch():
 
     def _updated_on_filter(self):
         if self._updatedon:
-            self._queryset = self._queryset.filter(updated_at__date__date=self._updatedon)
+            self._queryset = self._queryset.filter(updated_at__date=self._updatedon)
         elif self._updatedon_lte:
             self._queryset = self._queryset.filter(updated_at__date__lte=self._updatedon_lte)
         elif self._updatedon_gte:
```
cookbook/helper/recipe_url_import.py

```diff
@@ -155,7 +155,7 @@ def get_from_scraper(scrape, request):
 
     # assign steps
     try:
-        for i in parse_instructions(scrape.instructions()):
+        for i in parse_instructions(scrape.instructions_list()):
             recipe_json['steps'].append({
                 'instruction': i,
                 'ingredients': [],
@@ -177,11 +177,11 @@ def get_from_scraper(scrape, request):
         for x in scrape.ingredients():
             if x.strip() != '':
                 try:
-                    amount, unit, ingredient, note = ingredient_parser.parse(x)
+                    amount, unit, food, note = ingredient_parser.parse(x)
                     ingredient = {
                         'amount': amount,
                         'food': {
-                            'name': ingredient,
+                            'name': food,
                         },
                         'unit': None,
                         'note': note,
```
```diff
@@ -315,14 +315,29 @@ def clean_instruction_string(instruction):
     # handle unsupported, special UTF8 character in Thermomix-specific instructions,
     # that happen in nearly every recipe on Cookidoo, Zaubertopf Club, Rezeptwelt
     # and in Thermomix-specific recipes on many other sites
-    return normalized_string \
-        .replace("\ue003", _('reverse rotation')) \
-        .replace("\ue002", _('careful rotation')) \
-        .replace("\ue001", _('knead')) \
-        .replace("Andicken ", _('thicken')) \
-        .replace("Erwärmen ", _('warm up')) \
-        .replace("Fermentieren ", _('ferment')) \
-        .replace("Sous-vide ", _("sous-vide"))
+    normalized_string = normalized_string \
+        .replace(u"\uE003", _('reverse rotation')) \
+        .replace(u"\uE002", _('careful rotation')) \
+        .replace(u"\uE001", _('knead')) \
+        .replace(u"\uE031", _('thicken')) \
+        .replace(u"\uE019", _('warm up')) \
+        .replace(u"\uE02E", _('ferment')) \
+        .replace(u"\uE018", _('slow cook')) \
+        .replace(u"\uE033", _('egg boiler')) \
+        .replace(u"\uE016", _('kettle')) \
+        .replace(u"\uE01E", _('blend')) \
+        .replace(u"\uE011", _('pre-clean')) \
+        .replace(u"\uE026", _('high temperature')) \
+        .replace(u"\uE00D", _('rice cooker')) \
+        .replace(u"\uE00C", _('caramelize')) \
+        .replace(u"\uE038", _('peeler')) \
+        .replace(u"\uE037", _('slicer')) \
+        .replace(u"\uE036", _('grater')) \
+        .replace(u"\uE04C", _('spiralizer')) \
+        .replace(u"\uE02D", _("sous-vide"))
 
 
     return normalized_string
 
 
 def parse_instructions(instructions):
```
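
Because `gettext` returns the message unchanged when no translation catalog is loaded, the effect of these mappings is easy to show stand-alone. The instruction string below is made up; `U+E003` is the 'reverse rotation' glyph from the table above:

```python
from gettext import gettext as _

instruction = "\uE003 / speed 2 / 3 min"  # invented Cookidoo-style step
print(instruction.replace("\uE003", _('reverse rotation')))
# -> reverse rotation / speed 2 / 3 min
```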
```diff
@@ -403,6 +418,8 @@ def parse_servings_text(servings):
 
 
 def parse_time(recipe_time):
+    if not recipe_time:
+        return 0
     if type(recipe_time) not in [int, float]:
         try:
             recipe_time = float(re.search(r'\d+', recipe_time).group())
```
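
The two added lines guard against falsy inputs such as `None` or an empty string, which previously reached `re.search` and raised a `TypeError`. A reduced sketch of the guarded function, reproducing only the code paths visible in the diff:

```python
import re

def parse_time(recipe_time):
    # falsy values (None, '', 0) now short-circuit to 0 before the regex runs
    if not recipe_time:
        return 0
    if type(recipe_time) not in [int, float]:
        recipe_time = float(re.search(r'\d+', recipe_time).group())
    return recipe_time

print(parse_time(None))          # 0 (re.search(r'\d+', None) would raise TypeError)
print(parse_time('45 minutes'))  # 45.0
```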
```diff
@@ -1,8 +1,15 @@
 from django.contrib.auth.models import Group
+from django.http import HttpResponseRedirect
+from django.urls import reverse
 from django_scopes import scope, scopes_disabled
 from oauth2_provider.contrib.rest_framework import OAuth2Authentication
 from psycopg2.errors import UniqueViolation
 from rest_framework.exceptions import AuthenticationFailed
 
+import random
+
+from cookbook.helper.permission_helper import create_space_for_user
 from cookbook.models import Space, UserSpace
 from cookbook.views import views
 from recipes import settings
@@ -34,16 +41,28 @@ class ScopeMiddleware:
             if request.path.startswith(prefix + '/switch-space/'):
                 return self.get_response(request)
 
             with scopes_disabled():
                 if request.user.userspace_set.count() == 0 and not reverse('account_logout') in request.path:
                     return views.space_overview(request)
                 if request.path.startswith(prefix + '/invite/'):
                     return self.get_response(request)
 
                 # get active user space, if for some reason more than one space is active select first (group permission checks will fail, this is not intended at this point)
                 user_space = request.user.userspace_set.filter(active=True).first()
 
-                if not user_space:
-                    return views.space_overview(request)
+                if not user_space and request.user.userspace_set.count() > 0:
+                    # if the users has a userspace but nothing is active, activate the first one
+                    user_space = request.user.userspace_set.first()
+                    if user_space:
+                        user_space.active = True
+                        user_space.save()
+
+                if not user_space:
+                    if 'signup_token' in request.session:
+                        # if user is authenticated, has no space but a signup token (InviteLink) is present, redirect to invite link logic
+                        return HttpResponseRedirect(reverse('view_invite', args=[request.session.pop('signup_token', '')]))
+                    else:
+                        # if user does not yet have a space create one for him
+                        user_space = create_space_for_user(request.user)
 
                 # TODO remove the need for this view
                 if user_space.groups.count() == 0 and not reverse('account_logout') in request.path:
                     return views.no_groups(request)
```
cookbook/integration/integration.py

```diff
@@ -26,6 +26,12 @@ class Integration:
     files = None
     export_type = None
     ignored_recipes = []
     import_log = None
     import_duplicates = False
 
+    import_meal_plans = True
+    import_shopping_lists = True
+    nutrition_per_serving = False
+
     def __init__(self, request, export_type):
         """
@@ -102,7 +108,7 @@ class Integration:
         """
         return True
 
-    def do_import(self, files, il, import_duplicates):
+    def do_import(self, files, il, import_duplicates, meal_plans=True, shopping_lists=True, nutrition_per_serving=False):
         """
         Imports given files
         :param import_duplicates: if true duplicates are imported as well
@@ -111,6 +117,12 @@ class Integration:
         :return: HttpResponseRedirect to the recipe search showing all imported recipes
         """
         with scope(space=self.request.space):
             self.import_log = il
             self.import_duplicates = import_duplicates
 
+            self.import_meal_plans = meal_plans
+            self.import_shopping_lists = shopping_lists
+            self.nutrition_per_serving = nutrition_per_serving
+
             try:
                 self.files = files
```
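
Because the three new options are keyword arguments with defaults, existing callers that pass only `files`, `il` and `import_duplicates` keep the old behaviour. A minimal stand-in demonstrating that property; the body is a placeholder, not the real import logic:

```python
def do_import(files, il, import_duplicates,
              meal_plans=True, shopping_lists=True, nutrition_per_serving=False):
    # placeholder body: just echo the option flags
    return meal_plans, shopping_lists, nutrition_per_serving

print(do_import([], None, False))                              # (True, True, False)
print(do_import([], None, False, nutrition_per_serving=True))  # (True, True, True)
```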
```diff
@@ -166,20 +178,24 @@ class Integration:
                         il.total_recipes = len(new_file_list)
                         file_list = new_file_list
 
-                    for z in file_list:
-                        try:
-                            if not hasattr(z, 'filename') or isinstance(z, Tag):
-                                recipe = self.get_recipe_from_file(z)
-                            else:
-                                recipe = self.get_recipe_from_file(BytesIO(import_zip.read(z.filename)))
-                            recipe.keywords.add(self.keyword)
-                            il.msg += self.get_recipe_processed_msg(recipe)
-                            self.handle_duplicates(recipe, import_duplicates)
-                            il.imported_recipes += 1
-                            il.save()
-                        except Exception as e:
-                            traceback.print_exc()
-                            self.handle_exception(e, log=il, message=f'-------------------- \nERROR \n{e}\n--------------------\n')
+                    if isinstance(self, cookbook.integration.mealie1.Mealie1):
+                        # since the mealie 1.0 export is a backup and not a classic recipe export we treat it a bit differently
+                        recipes = self.get_recipe_from_file(import_zip)
+                    else:
+                        for z in file_list:
+                            try:
+                                if not hasattr(z, 'filename') or isinstance(z, Tag):
+                                    recipe = self.get_recipe_from_file(z)
+                                else:
+                                    recipe = self.get_recipe_from_file(BytesIO(import_zip.read(z.filename)))
+                                recipe.keywords.add(self.keyword)
+                                il.msg += self.get_recipe_processed_msg(recipe)
+                                self.handle_duplicates(recipe, import_duplicates)
+                                il.imported_recipes += 1
+                                il.save()
+                            except Exception as e:
+                                traceback.print_exc()
+                                self.handle_exception(e, log=il, message=f'-------------------- \nERROR \n{e}\n--------------------\n')
                     import_zip.close()
                 elif '.json' in f['name'] or '.xml' in f['name'] or '.txt' in f['name'] or '.mmf' in f['name'] or '.rk' in f['name'] or '.melarecipe' in f['name']:
                     data_list = self.split_recipe_file(f['file'])
```
cookbook/integration/mealie1.py (new file, 352 lines)
@@ -0,0 +1,352 @@

```python
import json
import re
import traceback
import uuid
from decimal import Decimal
from io import BytesIO
from zipfile import ZipFile
from gettext import gettext as _

from django.db import transaction

from cookbook.helper import ingredient_parser
from cookbook.helper.image_processing import get_filetype
from cookbook.helper.ingredient_parser import IngredientParser
from cookbook.helper.recipe_url_import import parse_servings, parse_servings_text, parse_time
from cookbook.integration.integration import Integration
from cookbook.models import Ingredient, Keyword, Recipe, Step, Food, Unit, SupermarketCategory, PropertyType, Property, MealType, MealPlan, CookLog, ShoppingListEntry


class Mealie1(Integration):
    """
    integration for mealie past version 1.0
    """

    def get_recipe_from_file(self, file):
        mealie_database = json.loads(BytesIO(file.read('database.json')).getvalue().decode("utf-8"))
        self.import_log.total_recipes = len(mealie_database['recipes'])
        self.import_log.msg += f"Importing {len(mealie_database["categories"]) + len(mealie_database["tags"])} tags and categories as keywords...\n"
        self.import_log.save()

        keywords_categories_dict = {}
        for c in mealie_database['categories']:
            if keyword := Keyword.objects.filter(name=c['name'], space=self.request.space).first():
                keywords_categories_dict[c['id']] = keyword.pk
            else:
                keyword = Keyword.objects.create(name=c['name'], space=self.request.space)
                keywords_categories_dict[c['id']] = keyword.pk

        keywords_tags_dict = {}
        for t in mealie_database['tags']:
            if keyword := Keyword.objects.filter(name=t['name'], space=self.request.space).first():
                keywords_tags_dict[t['id']] = keyword.pk
            else:
                keyword = Keyword.objects.create(name=t['name'], space=self.request.space)
                keywords_tags_dict[t['id']] = keyword.pk

        self.import_log.msg += f"Importing {len(mealie_database["multi_purpose_labels"])} multi purpose labels as supermarket categories...\n"
        self.import_log.save()

        supermarket_categories_dict = {}
        for m in mealie_database['multi_purpose_labels']:
            if supermarket_category := SupermarketCategory.objects.filter(name=m['name'], space=self.request.space).first():
                supermarket_categories_dict[m['id']] = supermarket_category.pk
            else:
                supermarket_category = SupermarketCategory.objects.create(name=m['name'], space=self.request.space)
                supermarket_categories_dict[m['id']] = supermarket_category.pk

        self.import_log.msg += f"Importing {len(mealie_database["ingredient_foods"])} foods...\n"
        self.import_log.save()

        foods_dict = {}
        for f in mealie_database['ingredient_foods']:
            if food := Food.objects.filter(name=f['name'], space=self.request.space).first():
                foods_dict[f['id']] = food.pk
            else:
                food = {'name': f['name'],
                        'plural_name': f['plural_name'],
                        'description': f['description'],
                        'space': self.request.space}

                if f['label_id'] and f['label_id'] in supermarket_categories_dict:
                    food['supermarket_category_id'] = supermarket_categories_dict[f['label_id']]

                food = Food.objects.create(**food)
                if f['on_hand']:
                    food.onhand_users.add(self.request.user)
                foods_dict[f['id']] = food.pk

        self.import_log.msg += f"Importing {len(mealie_database["ingredient_units"])} units...\n"
        self.import_log.save()

        units_dict = {}
        for u in mealie_database['ingredient_units']:
            if unit := Unit.objects.filter(name=u['name'], space=self.request.space).first():
                units_dict[u['id']] = unit.pk
            else:
                unit = Unit.objects.create(name=u['name'], plural_name=u['plural_name'], description=u['description'], space=self.request.space)
                units_dict[u['id']] = unit.pk

        recipes_dict = {}
        recipe_property_factor_dict = {}
        recipes = []
        recipe_keyword_relation = []
        for r in mealie_database['recipes']:
            if Recipe.objects.filter(space=self.request.space, name=r['name']).exists() and not self.import_duplicates:
                self.import_log.msg += f"Ignoring {r['name']} because a recipe with this name already exists.\n"
                self.import_log.save()
            else:
                recipe = Recipe.objects.create(
                    waiting_time=parse_time(r['perform_time']),
                    working_time=parse_time(r['prep_time']),
                    description=r['description'][:512],
                    name=r['name'],
                    source_url=r['org_url'],
                    servings=r['recipe_servings'] if r['recipe_servings'] and r['recipe_servings'] != 0 else 1,
                    servings_text=r['recipe_yield'].strip() if r['recipe_yield'] else "",
                    internal=True,
                    created_at=r['created_at'],
                    space=self.request.space,
                    created_by=self.request.user,
                )

                if not self.nutrition_per_serving:
                    recipe_property_factor_dict[r['id']] = recipe.servings

                self.import_log.msg += self.get_recipe_processed_msg(recipe)
                self.import_log.imported_recipes += 1
                self.import_log.save()

                recipes.append(recipe)
                recipes_dict[r['id']] = recipe.pk
                recipe_keyword_relation.append(Recipe.keywords.through(recipe_id=recipe.pk, keyword_id=self.keyword.pk))

        Recipe.keywords.through.objects.bulk_create(recipe_keyword_relation, ignore_conflicts=True)

        self.import_log.msg += f"Importing {len(mealie_database["recipe_instructions"])} instructions...\n"
        self.import_log.save()

        steps_relation = []
        first_step_of_recipe_dict = {}
        for s in mealie_database['recipe_instructions']:
            if s['recipe_id'] in recipes_dict:
                step = Step.objects.create(instruction=(s['text'] if s['text'] else "") + (f" \n {s['summary']}" if s['summary'] else ""),
                                           order=s['position'],
                                           name=s['title'],
                                           space=self.request.space)
                steps_relation.append(Recipe.steps.through(recipe_id=recipes_dict[s['recipe_id']], step_id=step.pk))
                if s['recipe_id'] not in first_step_of_recipe_dict:
                    first_step_of_recipe_dict[s['recipe_id']] = step.pk

        # it is possible for a recipe to not have steps but have ingredients, in that case create an empty step to add them to later
        for r in recipes_dict.keys():
            if r not in first_step_of_recipe_dict:
                step = Step.objects.create(instruction='',
                                           order=0,
                                           name='',
                                           space=self.request.space)
                steps_relation.append(Recipe.steps.through(recipe_id=recipes_dict[r], step_id=step.pk))
                first_step_of_recipe_dict[r] = step.pk

        for n in mealie_database['notes']:
            if n['recipe_id'] in recipes_dict:
                step = Step.objects.create(instruction=n['text'],
                                           name=n['title'],
                                           order=100,
                                           space=self.request.space)
                steps_relation.append(Recipe.steps.through(recipe_id=recipes_dict[n['recipe_id']], step_id=step.pk))

        Recipe.steps.through.objects.bulk_create(steps_relation)

        ingredient_parser = IngredientParser(self.request, True)

        self.import_log.msg += f"Importing {len(mealie_database["recipes_ingredients"])} ingredients...\n"
        self.import_log.save()

        ingredients_relation = []
        for i in mealie_database['recipes_ingredients']:
            if i['recipe_id'] in recipes_dict:
                if i['title']:
                    title_ingredient = Ingredient.objects.create(
                        note=i['title'],
                        is_header=True,
                        space=self.request.space,
                    )
                    ingredients_relation.append(Step.ingredients.through(step_id=first_step_of_recipe_dict[i['recipe_id']], ingredient_id=title_ingredient.pk))
                if i['food_id']:
                    ingredient = Ingredient.objects.create(
                        food_id=foods_dict[i['food_id']] if i['food_id'] in foods_dict else None,
                        unit_id=units_dict[i['unit_id']] if i['unit_id'] in units_dict else None,
                        original_text=i['original_text'],
                        order=i['position'],
                        amount=i['quantity'] if i['quantity'] else 0,
                        note=i['note'],
                        space=self.request.space,
                    )
                    ingredients_relation.append(Step.ingredients.through(step_id=first_step_of_recipe_dict[i['recipe_id']], ingredient_id=ingredient.pk))
                elif i['note'].strip():
                    amount, unit, food, note = ingredient_parser.parse(i['note'].strip())
                    f = ingredient_parser.get_food(food)
                    u = ingredient_parser.get_unit(unit)
                    ingredient = Ingredient.objects.create(
                        food=f,
                        unit=u,
                        amount=amount,
                        note=note,
                        original_text=i['original_text'],
                        space=self.request.space,
                    )
                    ingredients_relation.append(Step.ingredients.through(step_id=first_step_of_recipe_dict[i['recipe_id']], ingredient_id=ingredient.pk))
        Step.ingredients.through.objects.bulk_create(ingredients_relation)

        self.import_log.msg += f"Importing {len(mealie_database["recipes_to_categories"]) + len(mealie_database["recipes_to_tags"])} category and keyword relations...\n"
        self.import_log.save()

        recipe_keyword_relation = []
        for rC in mealie_database['recipes_to_categories']:
            if rC['recipe_id'] in recipes_dict:
                recipe_keyword_relation.append(Recipe.keywords.through(recipe_id=recipes_dict[rC['recipe_id']], keyword_id=keywords_categories_dict[rC['category_id']]))

        for rT in mealie_database['recipes_to_tags']:
            if rT['recipe_id'] in recipes_dict:
                recipe_keyword_relation.append(Recipe.keywords.through(recipe_id=recipes_dict[rT['recipe_id']], keyword_id=keywords_tags_dict[rT['tag_id']]))

        Recipe.keywords.through.objects.bulk_create(recipe_keyword_relation, ignore_conflicts=True)

        self.import_log.msg += f"Importing {len(mealie_database["recipe_nutrition"])} properties...\n"
        self.import_log.save()

        property_types_dict = {
            'calories': PropertyType.objects.get_or_create(name=_('Calories'), space=self.request.space, defaults={'unit': 'kcal', 'fdc_id': 1008})[0],
            'carbohydrate_content': PropertyType.objects.get_or_create(name=_('Carbohydrates'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1005})[0],
            'cholesterol_content': PropertyType.objects.get_or_create(name=_('Cholesterol'), space=self.request.space, defaults={'unit': 'mg', 'fdc_id': 1253})[0],
            'fat_content': PropertyType.objects.get_or_create(name=_('Fat'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1004})[0],
            'fiber_content': PropertyType.objects.get_or_create(name=_('Fiber'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1079})[0],
            'protein_content': PropertyType.objects.get_or_create(name=_('Protein'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1003})[0],
            'saturated_fat_content': PropertyType.objects.get_or_create(name=_('Saturated Fat'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1258})[0],
            'sodium_content': PropertyType.objects.get_or_create(name=_('Sodium'), space=self.request.space, defaults={'unit': 'mg', 'fdc_id': 1093})[0],
            'sugar_content': PropertyType.objects.get_or_create(name=_('Sugar'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1063})[0],
            'trans_fat_content': PropertyType.objects.get_or_create(name=_('Trans Fat'), space=self.request.space, defaults={'unit': 'g', 'fdc_id': 1257})[0],
            'unsaturated_fat_content': PropertyType.objects.get_or_create(name=_('Unsaturated Fat'), space=self.request.space, defaults={'unit': 'g'})[0],
        }

        with transaction.atomic():
            recipe_properties_relation = []
            properties_relation = []
            for r in mealie_database['recipe_nutrition']:
                if r['recipe_id'] in recipes_dict:
                    for key in property_types_dict:
                        if r[key]:
                            properties_relation.append(
                                Property(property_type_id=property_types_dict[key].pk,
                                         property_amount=Decimal(str(r[key])) / (
                                             Decimal(str(recipe_property_factor_dict[r['recipe_id']])) if r['recipe_id'] in recipe_property_factor_dict else 1),
                                         open_data_food_slug=r['recipe_id'],
                                         space=self.request.space))
            properties = Property.objects.bulk_create(properties_relation)
            property_ids = []
            for p in properties:
                recipe_properties_relation.append(Recipe.properties.through(recipe_id=recipes_dict[p.open_data_food_slug], property_id=p.pk))
                property_ids.append(p.pk)
            Recipe.properties.through.objects.bulk_create(recipe_properties_relation, ignore_conflicts=True)
            Property.objects.filter(id__in=property_ids).update(open_data_food_slug=None)

        # delete unused property types
        for pT in property_types_dict:
            try:
                property_types_dict[pT].delete()
            except:
                pass

        self.import_log.msg += f"Importing {len(mealie_database["recipe_comments"]) + len(mealie_database["recipe_timeline_events"])} comments and cook logs...\n"
        self.import_log.save()

        cook_log_list = []
        for c in mealie_database['recipe_comments']:
            if c['recipe_id'] in recipes_dict:
                cook_log_list.append(CookLog(
                    recipe_id=recipes_dict[c['recipe_id']],
                    comment=c['text'],
                    created_at=c['created_at'],
                    created_by=self.request.user,
                    space=self.request.space,
                ))

        for c in mealie_database['recipe_timeline_events']:
            if c['recipe_id'] in recipes_dict:
                if c['event_type'] == 'comment':
                    cook_log_list.append(CookLog(
                        recipe_id=recipes_dict[c['recipe_id']],
                        comment=c['message'],
                        created_at=c['created_at'],
                        created_by=self.request.user,
                        space=self.request.space,
                    ))

        CookLog.objects.bulk_create(cook_log_list)

        if self.import_meal_plans:
            self.import_log.msg += f"Importing {len(mealie_database["group_meal_plans"])} meal plans...\n"
            self.import_log.save()

            meal_types_dict = {}
            meal_plans = []
            for m in mealie_database['group_meal_plans']:
                if m['recipe_id'] in recipes_dict:
                    if not m['entry_type'] in meal_types_dict:
                        meal_type = MealType.objects.get_or_create(name=m['entry_type'], created_by=self.request.user, space=self.request.space)[0]
                        meal_types_dict[m['entry_type']] = meal_type.pk
                    meal_plans.append(MealPlan(
                        recipe_id=recipes_dict[m['recipe_id']] if m['recipe_id'] else None,
                        title=m['title'] if m['title'] else "",
                        note=m['text'] if m['text'] else "",
                        from_date=m['date'],
                        to_date=m['date'],
                        meal_type_id=meal_types_dict[m['entry_type']],
                        created_by=self.request.user,
                        space=self.request.space,
                    ))

            MealPlan.objects.bulk_create(meal_plans)

        if self.import_shopping_lists:
            self.import_log.msg += f"Importing {len(mealie_database["shopping_list_items"])} shopping list items...\n"
            self.import_log.save()

            shopping_list_items = []
            for sli in mealie_database['shopping_list_items']:
                if not sli['checked']:
                    if sli['food_id']:
                        shopping_list_items.append(ShoppingListEntry(
                            amount=sli['quantity'],
                            unit_id=units_dict[sli['unit_id']] if sli['unit_id'] else None,
                            food_id=foods_dict[sli['food_id']] if sli['food_id'] else None,
                            created_by=self.request.user,
                            space=self.request.space,
                        ))
                    elif not sli['food_id'] and sli['note'].strip():
                        amount, unit, food, note = ingredient_parser.parse(sli['note'].strip())
                        f = ingredient_parser.get_food(food)
                        u = ingredient_parser.get_unit(unit)
                        shopping_list_items.append(ShoppingListEntry(
                            amount=amount,
                            unit=u,
                            food=f,
                            created_by=self.request.user,
                            space=self.request.space,
                        ))
            ShoppingListEntry.objects.bulk_create(shopping_list_items)

        self.import_log.msg += f"Importing Images. This might take some time ...\n"
        self.import_log.save()
        for r in mealie_database['recipes']:
            try:
                if recipe := Recipe.objects.filter(pk=recipes_dict[r['id']]).first():
                    self.import_recipe_image(recipe, BytesIO(file.read(f'data/recipes/{str(uuid.UUID(str(r['id'])))}/images/original.webp')), filetype='.webp')
            except Exception:
                pass

        return recipes

    def get_file_from_recipe(self, recipe):
        raise NotImplementedError('Method not implemented in storage integration')
```
Translation updates follow: the binary `.mo` catalogs and the large `.po` diffs for the affected locales are suppressed in this view ("Binary file not shown" / "File diff suppressed because it is too large"). One new catalog appears in this range: `cookbook/locale/hr/LC_MESSAGES/django.mo`.
cookbook/migrations/0228_space_space_setup_completed.py (new file, 18 lines)
@@ -0,0 +1,18 @@

```python
# Generated by Django 5.2.6 on 2025-09-10 20:11

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('cookbook', '0001_squashed_0227_space_ai_default_provider_and_more'),
    ]

    operations = [
        migrations.AddField(
            model_name='space',
            name='space_setup_completed',
            field=models.BooleanField(default=True),
        ),
    ]
```
cookbook/migrations/0229_alter_ailog_options_alter_aiprovider_options_and_more.py (new file, 26 lines)

@@ -0,0 +1,26 @@

```python
# Generated by Django 5.2.6 on 2025-09-24 17:20

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('cookbook', '0228_space_space_setup_completed'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='ailog',
            options={'ordering': ('-created_at',)},
        ),
        migrations.AlterModelOptions(
            name='aiprovider',
            options={'ordering': ('id',)},
        ),
        migrations.AlterField(
            model_name='storage',
            name='token',
            field=models.CharField(blank=True, max_length=4098, null=True),
        ),
    ]
```
cookbook/migrations/0230_auto_20250925_2056.py (new file, 15 lines)
@@ -0,0 +1,15 @@

```python
# Generated by Django 5.2.6 on 2025-09-25 18:56

from django.db import migrations
from django.contrib.postgres.operations import TrigramExtension, UnaccentExtension


class Migration(migrations.Migration):

    dependencies = [
        ('cookbook', '0229_alter_ailog_options_alter_aiprovider_options_and_more'),
    ]

    operations = [
        TrigramExtension(),
        UnaccentExtension(),
    ]
```
cookbook/models.py

```diff
@@ -329,6 +329,8 @@ class Space(ExportModelOperationsMixin('space'), models.Model):
     demo = models.BooleanField(default=False)
     food_inherit = models.ManyToManyField(FoodInheritField, blank=True)
 
+    space_setup_completed = models.BooleanField(default=True)
+
     ai_enabled = models.BooleanField(default=True)
     ai_credits_monthly = models.IntegerField(default=100)
     ai_credits_balance = models.DecimalField(default=0, max_digits=16, decimal_places=4)
@@ -415,9 +417,17 @@ class AiProvider(models.Model):
     created_at = models.DateTimeField(auto_now_add=True)
     updated_at = models.DateTimeField(auto_now=True)
 
+    def __str__(self):
+        return self.name
+
+    class Meta:
+        ordering = ('id',)
+
 
 class AiLog(models.Model, PermissionModelMixin):
     F_FILE_IMPORT = 'FILE_IMPORT'
     F_STEP_SORT = 'STEP_SORT'
     F_FOOD_PROPERTIES = 'FOOD_PROPERTIES'
 
     ai_provider = models.ForeignKey(AiProvider, on_delete=models.SET_NULL, null=True)
     function = models.CharField(max_length=64)
@@ -435,6 +445,12 @@ class AiLog(models.Model, PermissionModelMixin):
     created_at = models.DateTimeField(auto_now_add=True)
     updated_at = models.DateTimeField(auto_now=True)
 
+    def __str__(self):
+        return f"{self.function} {self.ai_provider.name} {self.created_at}"
+
+    class Meta:
+        ordering = ('-created_at',)
+
 
 class ConnectorConfig(models.Model, PermissionModelMixin):
     HOMEASSISTANT = 'HomeAssistant'
@@ -576,7 +592,7 @@ class Storage(models.Model, PermissionModelMixin):
     )
     username = models.CharField(max_length=128, blank=True, null=True)
     password = models.CharField(max_length=128, blank=True, null=True)
-    token = models.CharField(max_length=512, blank=True, null=True)
+    token = models.CharField(max_length=4098, blank=True, null=True)
     url = models.URLField(blank=True, null=True)
     path = models.CharField(blank=True, default='', max_length=256)
     created_by = models.ForeignKey(User, on_delete=models.PROTECT)
@@ -791,14 +807,7 @@ class Food(ExportModelOperationsMixin('food'), TreeModel, PermissionModelMixin):
         self.delete()
         return target
 
-    def delete(self):
-        if self.ingredient_set.all().exclude(step=None).count() > 0:
-            raise ProtectedError(self.name + _(" is part of a recipe step and cannot be deleted"), self.ingredient_set.all().exclude(step=None))
-        else:
-            return super().delete()
-
     # MP_Tree move uses raw SQL to execute move, override behavior to force a save triggering post_save signal
     def move(self, *args, **kwargs):
         super().move(*args, **kwargs)
         # treebeard bypasses ORM, need to explicity save to trigger post save signals retrieve the object again to avoid writing previous state back to disk
```
cookbook/serializers.py

```diff
@@ -26,7 +26,7 @@ from cookbook.helper.CustomStorageClass import CachedS3Boto3Storage
 from cookbook.helper.HelperFunctions import str2bool
 from cookbook.helper.ai_helper import get_monthly_token_usage
 from cookbook.helper.image_processing import is_file_type_allowed
-from cookbook.helper.permission_helper import above_space_limit
+from cookbook.helper.permission_helper import above_space_limit, create_space_for_user
 from cookbook.helper.property_helper import FoodPropertyHelper
 from cookbook.helper.shopping_helper import RecipeShoppingEditor
 from cookbook.helper.unit_conversion_helper import UnitConversionHelper
@@ -151,19 +151,22 @@ class CustomOnHandField(serializers.Field):
         return instance
 
     def to_representation(self, obj):
-        if not self.context["request"].user.is_authenticated:
-            return []
-        shared_users = []
-        if c := caches['default'].get(f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', None):
-            shared_users = c
-        else:
-            try:
-                shared_users = [x.id for x in list(self.context['request'].user.get_shopping_share())] + [self.context['request'].user.id]
-                caches['default'].set(f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', shared_users, timeout=5 * 60)
-            # TODO ugly hack that improves API performance significantly, should be done properly
-            except AttributeError:  # Anonymous users (using share links) don't have shared users
-                pass
-        return obj.onhand_users.filter(id__in=shared_users).exists()
+        try:
+            if not self.context["request"].user.is_authenticated:
+                return []
+            shared_users = []
+            if c := caches['default'].get(f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', None):
+                shared_users = c
+            else:
+                try:
+                    shared_users = [x.id for x in list(self.context['request'].user.get_shopping_share())] + [self.context['request'].user.id]
+                    caches['default'].set(f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', shared_users, timeout=5 * 60)
+                # TODO ugly hack that improves API performance significantly, should be done properly
+                except AttributeError:  # Anonymous users (using share links) don't have shared users
+                    pass
+            return obj.onhand_users.filter(id__in=shared_users).exists()
+        except AttributeError:
+            return []
 
     def to_internal_value(self, data):
         return data
```
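
The pattern introduced here, and applied again to `FoodSerializer.get_substitute_onhand` below, wraps the whole lookup in `try/except AttributeError` so anonymous share-link requests, whose user object lacks the shopping-share attributes, degrade to an empty result instead of an error. A toy illustration with invented names:

```python
class AnonymousUser:
    pass  # deliberately lacks get_shopping_share()

def onhand_for(user):
    try:
        # anything touching attributes that anonymous users do not have
        return list(user.get_shopping_share())
    except AttributeError:
        return []  # share-link viewers simply see nothing on hand

print(onhand_for(AnonymousUser()))  # []
```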
@@ -335,17 +338,26 @@ class AiProviderSerializer(serializers.ModelSerializer):
|
||||
return super().create(validated_data)
|
||||
|
||||
def update(self, instance, validated_data):
|
||||
validated_data = self.handle_global_space_logic(validated_data)
|
||||
validated_data = self.handle_global_space_logic(validated_data, instance=instance)
|
||||
return super().update(instance, validated_data)
|
||||
|
||||
def handle_global_space_logic(self, validated_data):
|
||||
def handle_global_space_logic(self, validated_data, instance=None):
|
||||
"""
|
||||
allow superusers to create AI providers without a space but make sure everyone else only uses their own space
|
||||
"""
|
||||
if ('space' not in validated_data or not validated_data['space']) and self.context['request'].user.is_superuser:
|
||||
validated_data['space'] = None
|
||||
if self.context['request'].user.is_superuser:
|
||||
if ('space' not in validated_data or not validated_data['space']):
|
||||
validated_data['space'] = None
|
||||
else:
|
||||
validated_data['space'] = self.context['request'].space
|
||||
else:
|
||||
validated_data['space'] = self.context['request'].space
|
||||
if instance:
|
||||
validated_data['space'] = instance.space
|
||||
else:
|
||||
validated_data['space'] = self.context['request'].space
|
||||
|
||||
if 'log_credit_cost' in validated_data and not self.context['request'].user.is_superuser:
|
||||
del validated_data['log_credit_cost']
|
||||
|
||||
return validated_data
|
||||
|
||||
@@ -367,12 +379,12 @@ class AiLogSerializer(serializers.ModelSerializer):
|
||||
|
||||
class SpaceSerializer(WritableNestedModelSerializer):
|
||||
created_by = UserSerializer(read_only=True)
|
||||
user_count = serializers.SerializerMethodField('get_user_count')
|
||||
recipe_count = serializers.SerializerMethodField('get_recipe_count')
|
||||
file_size_mb = serializers.SerializerMethodField('get_file_size_mb')
|
||||
ai_monthly_credits_used = serializers.SerializerMethodField('get_ai_monthly_credits_used')
|
||||
user_count = serializers.SerializerMethodField('get_user_count', read_only=True)
|
||||
recipe_count = serializers.SerializerMethodField('get_recipe_count', read_only=True)
|
||||
file_size_mb = serializers.SerializerMethodField('get_file_size_mb', read_only=True)
|
||||
ai_monthly_credits_used = serializers.SerializerMethodField('get_ai_monthly_credits_used', read_only=True)
|
||||
ai_default_provider = AiProviderSerializer(required=False, allow_null=True)
|
||||
food_inherit = FoodInheritFieldSerializer(many=True)
|
||||
food_inherit = FoodInheritFieldSerializer(many=True, required=False)
|
||||
image = UserFileViewSerializer(required=False, many=False, allow_null=True)
|
||||
nav_logo = UserFileViewSerializer(required=False, many=False, allow_null=True)
|
||||
custom_space_theme = UserFileViewSerializer(required=False, many=False, allow_null=True)
|
||||
@@ -404,9 +416,26 @@ class SpaceSerializer(WritableNestedModelSerializer):
|
||||
return 0
|
||||
|
||||
def create(self, validated_data):
|
||||
raise ValidationError('Cannot create using this endpoint')
|
||||
if Space.objects.filter(created_by=self.context['request'].user).count() >= self.context['request'].user.userpreference.max_owned_spaces:
|
||||
raise serializers.ValidationError(
|
||||
_('You have the reached the maximum amount of spaces that can be owned by you.') + f' ({self.context['request'].user.userpreference.max_owned_spaces})')
|
||||
|
||||
name = None
|
||||
if 'name' in validated_data:
|
||||
name = validated_data['name']
|
||||
user_space = create_space_for_user(self.context['request'].user, name)
|
||||
return user_space.space
|
||||
|
||||
def update(self, instance, validated_data):
|
||||
validated_data = self.filter_superuser_parameters(validated_data)
|
||||
|
||||
if 'name' in validated_data:
|
||||
if Space.objects.filter(Q(name=validated_data['name']), ~Q(pk=instance.pk)).exists():
|
||||
raise ValidationError(_('Space Name must be unique.'))
|
||||
|
||||
return super().update(instance, validated_data)
|
||||
|
||||
def filter_superuser_parameters(self, validated_data):
|
||||
if 'ai_enabled' in validated_data and not self.context['request'].user.is_superuser:
|
||||
del validated_data['ai_enabled']
|
||||
|
||||
@@ -416,7 +445,7 @@ class SpaceSerializer(WritableNestedModelSerializer):
|
||||
if 'ai_credits_balance' in validated_data and not self.context['request'].user.is_superuser:
|
||||
del validated_data['ai_credits_balance']
|
||||
|
||||
return super().update(instance, validated_data)
|
||||
return validated_data

    class Meta:
        model = Space
@@ -425,7 +454,7 @@ class SpaceSerializer(WritableNestedModelSerializer):
            'allow_sharing', 'demo', 'food_inherit', 'user_count', 'recipe_count', 'file_size_mb',
            'image', 'nav_logo', 'space_theme', 'custom_space_theme', 'nav_bg_color', 'nav_text_color',
            'logo_color_32', 'logo_color_128', 'logo_color_144', 'logo_color_180', 'logo_color_192', 'logo_color_512', 'logo_color_svg', 'ai_credits_monthly',
            'ai_credits_balance', 'ai_monthly_credits_used', 'ai_enabled', 'ai_default_provider')
            'ai_credits_balance', 'ai_monthly_credits_used', 'ai_enabled', 'ai_default_provider', 'space_setup_completed')
        read_only_fields = (
            'id', 'created_by', 'created_at', 'max_recipes', 'max_file_storage_mb', 'max_users', 'allow_sharing',
            'demo', 'ai_monthly_credits_used')
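A hedged illustration of the superuser filtering above (the `request` object, the space instance and the field values are assumed for the sketch, not taken from the diff):

```python
# Sketch only: restricted AI fields are silently dropped for non-superusers
data = {'name': 'My Space', 'ai_enabled': True, 'ai_credits_balance': 100}
serializer = SpaceSerializer(space, data=data, partial=True,
                             context={'request': request})  # request.user is not a superuser
# inside update(), filter_superuser_parameters() reduces the payload to
# {'name': 'My Space'} before super().update() is applied, no error is raised
```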

@@ -817,28 +846,31 @@ class FoodSerializer(UniqueFieldsMixin, WritableNestedModelSerializer, ExtendedR

    @extend_schema_field(bool)
    def get_substitute_onhand(self, obj):
        if not self.context["request"].user.is_authenticated:
            return []
        shared_users = []
        if c := caches['default'].get(
                f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', None):
            shared_users = c
        else:
            try:
                shared_users = [x.id for x in list(self.context['request'].user.get_shopping_share())] + [
                    self.context['request'].user.id]
                caches['default'].set(
                    f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}',
                    shared_users, timeout=5 * 60)
                # TODO ugly hack that improves API performance significantly, should be done properly
            except AttributeError:  # Anonymous users (using share links) don't have shared users
                pass
        filter = Q(id__in=obj.substitute.all())
        if obj.substitute_siblings:
            filter |= Q(path__startswith=obj.path[:Food.steplen * (obj.depth - 1)], depth=obj.depth)
        if obj.substitute_children:
            filter |= Q(path__startswith=obj.path, depth__gt=obj.depth)
        return Food.objects.filter(filter).filter(onhand_users__id__in=shared_users).exists()
        try:
            if not self.context["request"].user.is_authenticated:
                return []
            shared_users = []
            if c := caches['default'].get(
                    f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}', None):
                shared_users = c
            else:
                try:
                    shared_users = [x.id for x in list(self.context['request'].user.get_shopping_share())] + [
                        self.context['request'].user.id]
                    caches['default'].set(
                        f'shopping_shared_users_{self.context["request"].space.id}_{self.context["request"].user.id}',
                        shared_users, timeout=5 * 60)
                    # TODO ugly hack that improves API performance significantly, should be done properly
                except AttributeError:  # Anonymous users (using share links) don't have shared users
                    pass
            filter = Q(id__in=obj.substitute.all())
            if obj.substitute_siblings:
                filter |= Q(path__startswith=obj.path[:Food.steplen * (obj.depth - 1)], depth=obj.depth)
            if obj.substitute_children:
                filter |= Q(path__startswith=obj.path, depth__gt=obj.depth)
            return Food.objects.filter(filter).filter(onhand_users__id__in=shared_users).exists()
        except AttributeError:
            return []

    def create(self, validated_data):
        name = validated_data['name'].strip()

@@ -1692,6 +1724,11 @@ class FdcQuerySerializer(serializers.Serializer):
    foods = FdcQueryFoodsSerializer(many=True)


class GenericModelReferenceSerializer(serializers.Serializer):
    id = serializers.IntegerField()
    model = serializers.CharField()
    name = serializers.CharField()
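The serializer above is deliberately model-agnostic; for orientation, an entry rendered through it has this shape (values invented for the example):

```python
# Illustrative payload of a single GenericModelReferenceSerializer entry
{'id': 42, 'model': 'Recipe', 'name': 'Pancakes'}
```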

# Export/Import Serializers


class KeywordExportSerializer(KeywordSerializer):


@@ -41,6 +41,12 @@ def test_list_space(obj_1, obj_2, u1_s1, u1_s2, space_2):
    assert json.loads(u1_s1.get(reverse(LIST_URL)).content)['count'] == 1
    assert json.loads(u1_s2.get(reverse(LIST_URL)).content)['count'] == 2

    obj_1.space = None
    obj_1.save()

    assert json.loads(u1_s1.get(reverse(LIST_URL)).content)['count'] == 2
    assert json.loads(u1_s2.get(reverse(LIST_URL)).content)['count'] == 2


@pytest.mark.parametrize("arg", [
    ['a_u', 403],

@@ -236,42 +236,6 @@ def test_delete(u1_s1, u1_s2, obj_1, obj_tree_1):
    assert Food.find_problems() == ([], [], [], [], [])


def test_integrity(u1_s1, recipe_1_s1):
    with scopes_disabled():
        assert Food.objects.count() == 10
        assert Ingredient.objects.count() == 10
        f_1 = Food.objects.first()

    # deleting food will fail because food is part of recipe
    r = u1_s1.delete(
        reverse(
            DETAIL_URL,
            args={f_1.id}
        )
    )
    assert r.status_code == 403

    with scopes_disabled():
        i_1 = f_1.ingredient_set.first()
        # remove Ingredient that references Food from recipe step
        i_1.step_set.first().ingredients.remove(i_1)
        assert Food.objects.count() == 10
        assert Ingredient.objects.count() == 10

    # deleting food will succeed because it's not part of a recipe and the delete will cascade to Ingredient
    r = u1_s1.delete(
        reverse(
            DETAIL_URL,
            args={f_1.id}
        )
    )
    assert r.status_code == 204

    with scopes_disabled():
        assert Food.objects.count() == 9
        assert Ingredient.objects.count() == 9


def test_move(u1_s1, obj_tree_1, obj_2, obj_3, space_1):
    with scope(space=space_1):
        # for some reason the 'path' attribute changes between the factory and the test when using both obj_tree and obj

@@ -7,6 +7,7 @@ from django.urls import reverse
from django_scopes import scopes_disabled

from cookbook.models import UserSpace
from recipes import settings

LIST_URL = 'api:space-list'
DETAIL_URL = 'api:space-detail'
@@ -45,7 +46,6 @@ def test_list_multiple(u1_s1, space_1, space_2):
    assert u1_response['id'] == space_1.id




@pytest.mark.parametrize("arg", [
    ['a_u', 403],
    ['g1_s1', 403],
@@ -70,9 +70,9 @@ def test_update(arg, request, space_1, a1_s1):

@pytest.mark.parametrize("arg", [
    ['a_u', 403],
    ['g1_s1', 403],
    ['u1_s1', 403],
    ['a1_s1', 405],
    ['g1_s1', 201],
    ['u1_s1', 201],
    ['a1_s1', 201],
])
def test_add(arg, request, u1_s2):
    c = request.getfixturevalue(arg[0])
@@ -90,3 +90,59 @@ def test_delete(u1_s1, u1_s2, a1_s1, space_1):
    # even the space owner cannot delete his space over the api (this might change later but for now it's only available in the UI)
    r = a1_s1.delete(reverse(DETAIL_URL, args={space_1.id}))
    assert r.status_code == 405


def test_superuser_parameters(space_1, a1_s1, s1_s1):
    # ------- test as normal (non-superuser) user -------
    response = a1_s1.post(reverse(LIST_URL), {'name': 'test', 'ai_enabled': not settings.SPACE_AI_ENABLED, 'ai_credits_monthly': settings.SPACE_AI_CREDITS_MONTHLY + 100, 'ai_credits_balance': 100},
                          content_type='application/json')

    assert response.status_code == 201
    response = json.loads(response.content)
    assert response['ai_enabled'] == settings.SPACE_AI_ENABLED
    assert response['ai_credits_monthly'] == settings.SPACE_AI_CREDITS_MONTHLY
    assert response['ai_credits_balance'] == 0

    space_1.created_by = auth.get_user(a1_s1)
    space_1.ai_enabled = False
    space_1.ai_credits_monthly = 0
    space_1.ai_credits_balance = 0
    space_1.save()

    response = a1_s1.patch(reverse(DETAIL_URL, args={space_1.id}), {'ai_enabled': True, 'ai_credits_monthly': 100, 'ai_credits_balance': 100},
                           content_type='application/json')

    assert response.status_code == 200

    space_1.refresh_from_db()
    assert space_1.ai_enabled == False
    assert space_1.ai_credits_monthly == 0
    assert space_1.ai_credits_balance == 0

    # ------- test as superuser -------

    response = s1_s1.post(reverse(LIST_URL),
                          {'name': 'test', 'ai_enabled': not settings.SPACE_AI_ENABLED, 'ai_credits_monthly': settings.SPACE_AI_CREDITS_MONTHLY + 100, 'ai_credits_balance': 100},
                          content_type='application/json')

    assert response.status_code == 201
    response = json.loads(response.content)
    assert response['ai_enabled'] == settings.SPACE_AI_ENABLED
    assert response['ai_credits_monthly'] == settings.SPACE_AI_CREDITS_MONTHLY
    assert response['ai_credits_balance'] == 0

    space_1.created_by = auth.get_user(s1_s1)
    space_1.ai_enabled = False
    space_1.ai_credits_monthly = 0
    space_1.ai_credits_balance = 0
    space_1.save()

    response = s1_s1.patch(reverse(DETAIL_URL, args={space_1.id}), {'ai_enabled': True, 'ai_credits_monthly': 100, 'ai_credits_balance': 100},
                           content_type='application/json')

    assert response.status_code == 200

    space_1.refresh_from_db()
    assert space_1.ai_enabled == True
    assert space_1.ai_credits_monthly == 100
    assert space_1.ai_credits_balance == 100
@@ -5,6 +5,8 @@ from django.contrib import auth
from django.urls import reverse
from django_scopes import scopes_disabled

from cookbook.models import UserSpace

LIST_URL = 'api:userspace-list'
DETAIL_URL = 'api:userspace-detail'

@@ -13,10 +15,10 @@ DETAIL_URL = 'api:userspace-detail'
    ['a_u', 403, 0],
    ['g1_s1', 200, 1],  # sees only own user space
    ['u1_s1', 200, 1],
    ['a1_s1', 200, 3],  # sees user space of all users in space
    ['a2_s1', 200, 1],
    ['a1_s1', 200, 4],  # admins can see all other members
    ['a2_s1', 200, 4],
])
def test_list_permission(arg, request, space_1, g1_s1, u1_s1, a1_s1):
def test_list_permission(arg, request, space_1, g1_s1, u1_s1, a1_s1, a2_s1):
    space_1.created_by = auth.get_user(a1_s1)
    space_1.save()

@@ -27,6 +29,18 @@ def test_list_permission(arg, request, space_1, g1_s1, u1_s1, a1_s1):
    assert len(json.loads(result.content)['results']) == arg[2]


def test_list_all_personal(space_2, u1_s1):
    result = u1_s1.get(reverse('api:userspace-all-personal'))
    assert result.status_code == 200
    assert len(json.loads(result.content)) == 1

    UserSpace.objects.create(user=auth.get_user(u1_s1), space=space_2)

    result = u1_s1.get(reverse('api:userspace-all-personal'))
    assert result.status_code == 200
    assert len(json.loads(result.content)) == 2


@pytest.mark.parametrize("arg", [
    ['a_u', 403],
    ['g1_s1', 403],
@@ -78,10 +78,11 @@ urlpatterns = [

    path('setup/', views.setup, name='view_setup'),
    path('no-group/', views.no_groups, name='view_no_group'),
    path('space-overview/', views.space_overview, name='view_space_overview'),
    path('switch-space/<int:space_id>', views.switch_space, name='view_switch_space'),
    path('no-perm/', views.no_perm, name='view_no_perm'),
    #path('space-overview/', views.space_overview, name='view_space_overview'),
    #path('switch-space/<int:space_id>', views.switch_space, name='view_switch_space'),
    #path('no-perm/', views.no_perm, name='view_no_perm'),
    path('invite/<slug:token>', views.invite_link, name='view_invite'),
    path('invite/<slug:token>/', views.invite_link, name='view_invite'),

    path('system/', views.system, name='view_system'),
    path('plugin/update/', views.plugin_update, name='view_plugin_update'),
@@ -103,6 +104,7 @@ urlpatterns = [
    path('api/sync_all/', api.sync_all, name='api_sync'),
    path('api/recipe-from-source/', api.RecipeUrlImportView.as_view(), name='api_recipe_from_source'),
    path('api/ai-import/', api.AiImportView.as_view(), name='api_ai_import'),
    path('api/ai-step-sort/', api.AiStepSortView.as_view(), name='api_ai_step_sort'),
    path('api/import-open-data/', api.ImportOpenData.as_view(), name='api_import_open_data'),
    path('api/ingredient-from-string/', api.ingredient_from_string, name='api_ingredient_from_string'),
    path('api/fdc-search/', api.FdcSearchView.as_view(), name='api_fdc_search'),
@@ -9,6 +9,7 @@ import threading
import traceback
import uuid
from collections import OrderedDict
from functools import wraps
from json import JSONDecodeError
from urllib.parse import unquote
from zipfile import ZipFile
@@ -18,15 +19,16 @@ import litellm
import redis
import requests
from PIL import UnidentifiedImageError
from PIL.ImImagePlugin import number
from PIL.features import check
from django.contrib import messages
from django.contrib.admin.utils import get_deleted_objects, NestedObjects
from django.contrib.auth.models import Group, User
from django.contrib.postgres.search import TrigramSimilarity
from django.core.cache import caches
from django.core.exceptions import FieldError, ValidationError
from django.core.files import File
from django.db.models import Case, Count, Exists, OuterRef, ProtectedError, Q, Subquery, Value, When
from django.db import DEFAULT_DB_ALIAS
from django.db.models import Case, Count, Exists, OuterRef, ProtectedError, Q, Subquery, Value, When, QuerySet
from django.db.models.deletion import Collector
from django.db.models.fields.related import ForeignObjectRel
from django.db.models.functions import Coalesce, Lower
from django.db.models.signals import post_save
@@ -35,7 +37,6 @@ from django.shortcuts import get_object_or_404, redirect
from django.urls import reverse
from django.utils import timezone
from django.utils.dateparse import parse_datetime
from django.utils.datetime_safe import date
from django.utils.translation import gettext as _
from django_scopes import scopes_disabled
from drf_spectacular.types import OpenApiTypes
@@ -76,7 +77,7 @@ from cookbook.helper.permission_helper import (CustomIsAdmin, CustomIsOwner, Cus
                                               CustomTokenHasScope, CustomUserPermission, IsReadOnlyDRF,
                                               above_space_limit,
                                               group_required, has_group_permission, is_space_owner,
                                               switch_user_active_space, CustomAiProviderPermission
                                               switch_user_active_space, CustomAiProviderPermission, IsCreateDRF
                                               )
from cookbook.helper.recipe_search import RecipeSearch
from cookbook.helper.recipe_url_import import clean_dict, get_from_youtube_scraper, get_images_from_soup
@@ -113,7 +114,7 @@ from cookbook.serializer import (AccessTokenSerializer, AutomationSerializer, Au
                                 LocalizationSerializer, ServerSettingsSerializer, RecipeFromSourceResponseSerializer, ShoppingListEntryBulkCreateSerializer, FdcQuerySerializer,
                                 AiImportSerializer, ImportOpenDataSerializer, ImportOpenDataMetaDataSerializer, ImportOpenDataResponseSerializer, ExportRequestSerializer,
                                 RecipeImportSerializer, ConnectorConfigSerializer, SearchPreferenceSerializer, SearchFieldsSerializer, RecipeBatchUpdateSerializer,
                                 AiProviderSerializer, AiLogSerializer, FoodBatchUpdateSerializer
                                 AiProviderSerializer, AiLogSerializer, FoodBatchUpdateSerializer, GenericModelReferenceSerializer
                                 )
from cookbook.version_info import TANDOOR_VERSION
from cookbook.views.import_export import get_integration
@@ -134,7 +135,7 @@ class LoggingMixin(object):

        if settings.REDIS_HOST:
            try:
                d = date.today().isoformat()
                d = timezone.now().isoformat()
                space = request.space
                endpoint = request.resolver_match.url_name

@@ -182,7 +183,10 @@ class StandardFilterModelViewSet(viewsets.ModelViewSet):
        queryset = self.queryset
        query = self.request.query_params.get('query', None)
        if query is not None:
            queryset = queryset.filter(name__icontains=query)
            try:
                queryset = queryset.filter(name__icontains=query)
            except FieldError:
                pass

        updated_at = self.request.query_params.get('updated_at', None)
        if updated_at is not None:
@@ -511,6 +515,143 @@ class TreeMixin(MergeMixin, FuzzyFilterMixin):
        return Response(content, status=status.HTTP_400_BAD_REQUEST)


def paginate(func):
    """
    pagination decorator for custom ViewSet actions
    """

    @wraps(func)
    def inner(self, *args, **kwargs):
        queryset = func(self, *args, **kwargs)
        assert isinstance(queryset, (list, QuerySet))

        page = self.paginate_queryset(queryset)
        if page is not None:
            serializer = self.get_serializer(page, many=True)
            return self.get_paginated_response(serializer.data)

        serializer = self.get_serializer(queryset, many=True)
        return Response(serializer.data)

    return inner
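A minimal usage sketch for the decorator above (the ViewSet and action name are hypothetical, not part of the diff); the decorated action just returns a queryset and `@paginate` wraps it into a standard paginated response:

```python
# Hypothetical example assuming a regular DRF ViewSet
class ExampleViewSet(viewsets.ModelViewSet):
    @decorators.action(detail=False, methods=['GET'])
    @paginate
    def recent(self, request):
        # @paginate handles page slicing, serialization and the Response
        return self.get_queryset().order_by('-updated_at')
```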


class DeleteRelationMixing:
    """
    mixin to add custom API functions for model delete dependency checking
    """

    @staticmethod
    def collect(obj):
        # collector.nested() seems to not include protected objects but does include cascading ones
        # collector.protected: objects that raise a Protected or Restricted error when deleting the unit
        # collector.field_updates: fields that get updated when deleting the unit
        # collector.model_objs: collects the objects that should be deleted together with the selected unit

        collector = NestedObjects(using=DEFAULT_DB_ALIAS)
        collector.collect([obj])
        return collector
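To make the collector attributes used in the comments above concrete, a hedged inspection sketch (the `obj` variable is assumed; the attributes are the same ones the methods below iterate over):

```python
# Sketch only: what the NestedObjects collector exposes for some object obj
collector = DeleteRelationMixing.collect(obj)
collector.protected      # objects that would raise a Protected/Restricted error
collector.model_objs     # dict of model -> objects that cascade-delete with obj
collector.field_updates  # relations that would be updated (e.g. set to null)
```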

    @extend_schema(responses=GenericModelReferenceSerializer(many=True), parameters=[
        OpenApiParameter(name='cache', description='If results can be cached or not', type=bool, default=True),
    ])
    @decorators.action(detail=True, methods=['GET'], serializer_class=GenericModelReferenceSerializer)
    @paginate
    def protecting(self, request, pk):
        """
        get a paginated list of objects that are protecting the selected object from being deleted
        """
        obj = self.queryset.filter(pk=pk, space=request.space).first()
        if obj:
            CACHE_KEY = f'DELETE_COLLECTOR_{request.space.pk}_PROTECTING_{obj.__class__.__name__}_{obj.pk}'
            cache = self.request.query_params.get('cache', "true") == "true"
            if caches['default'].has_key(CACHE_KEY) and cache:
                return caches['default'].get(CACHE_KEY)

            collector = self.collect(obj)
            protected_objects = []
            for o in collector.protected:
                protected_objects.append({
                    'id': o.pk,
                    'model': o.__class__.__name__,
                    'name': str(o),
                })

            caches['default'].set(CACHE_KEY, protected_objects, 60)
            return protected_objects
        else:
            return []

    @extend_schema(responses=GenericModelReferenceSerializer(many=True), parameters=[
        OpenApiParameter(name='cache', description='If results can be cached or not', type=bool, default=True),
    ])
    @decorators.action(detail=True, methods=['GET'], serializer_class=GenericModelReferenceSerializer)
    @paginate
    def cascading(self, request, pk):
        """
        get a paginated list of objects that will be cascaded (deleted) when deleting the selected object
        """
        obj = self.queryset.filter(pk=pk, space=request.space).first()
        if obj:
            CACHE_KEY = f'DELETE_COLLECTOR_{request.space.pk}_CASCADING_{obj.__class__.__name__}_{obj.pk}'
            cache = self.request.query_params.get('cache', "true") == "true"
            if caches['default'].has_key(CACHE_KEY) and cache:
                return caches['default'].get(CACHE_KEY)

            collector = self.collect(obj)

            cascading_objects = []
            for model, objs in collector.model_objs.items():
                for o in objs:
                    if o.pk != pk and o.__class__.__name__ != obj.__class__.__name__:
                        cascading_objects.append({
                            'id': o.pk,
                            'model': o.__class__.__name__,
                            'name': str(o),
                        })
            caches['default'].set(CACHE_KEY, cascading_objects, 60)
            return cascading_objects
        else:
            return []

    @extend_schema(responses=GenericModelReferenceSerializer(many=True), parameters=[
        OpenApiParameter(name='cache', description='If results can be cached or not', type=bool, default=True),
    ])
    @decorators.action(detail=True, methods=['GET'], serializer_class=GenericModelReferenceSerializer)
    @paginate
    def nulling(self, request, pk):
        """
        get a paginated list of objects from which the selected object will be removed when it is deleted
        """
        obj = self.queryset.filter(pk=pk, space=request.space).first()
        if obj:
            CACHE_KEY = f'DELETE_COLLECTOR_{request.space.pk}_NULLING_{obj.__class__.__name__}_{obj.pk}'
            cache = self.request.query_params.get('cache', "true") == "true"
            if caches['default'].has_key(CACHE_KEY) and cache:
                return caches['default'].get(CACHE_KEY)

            collector = self.collect(obj)

            nulling_objects = []
            # field_updates is a dict of relations that will be updated and querysets of items affected
            for key, value in collector.field_updates.items():
                # iterate over each queryset for the relation
                for qs in value:
                    # iterate over each object in the queryset of the relation
                    for o in qs:
                        nulling_objects.append(
                            {
                                'id': o.pk,
                                'model': o.__class__.__name__,
                                'name': str(o),
                            }
                        )
            caches['default'].set(CACHE_KEY, nulling_objects, 60)
            return nulling_objects
        else:
            return []
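Since the mixin is added to many ViewSets below, each affected model gains `protecting`, `cascading` and `nulling` detail routes. A hedged pytest-style call (the route name is an assumption, derived from how DRF names custom actions, and the fixtures mirror the ones used elsewhere in the tests):

```python
# Illustrative only: query the delete-dependency endpoint for a food object
r = u1_s1.get(reverse('api:food-protecting', args={obj_1.id}) + '?cache=false')
assert r.status_code == 200
# the paginated results contain {'id': ..., 'model': ..., 'name': ...} dicts
```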


@extend_schema_view(list=extend_schema(parameters=[
    OpenApiParameter(name='filter_list', description='User IDs, repeat for multiple', type=str, many=True),
]))
@@ -544,9 +685,9 @@ class GroupViewSet(LoggingMixin, viewsets.ModelViewSet):
class SpaceViewSet(LoggingMixin, viewsets.ModelViewSet):
    queryset = Space.objects
    serializer_class = SpaceSerializer
    permission_classes = [IsReadOnlyDRF & CustomIsGuest | CustomIsOwner & CustomIsAdmin & CustomTokenHasReadWriteScope]
    permission_classes = [((IsReadOnlyDRF | IsCreateDRF) & CustomIsGuest) | CustomIsOwner & CustomIsAdmin & CustomTokenHasReadWriteScope]
    pagination_class = DefaultPagination
    http_method_names = ['get', 'patch']
    http_method_names = ['get', 'post', 'put', 'patch']

    def get_queryset(self):
        return self.queryset.filter(
@@ -565,7 +706,7 @@ class SpaceViewSet(LoggingMixin, viewsets.ModelViewSet):
class UserSpaceViewSet(LoggingMixin, viewsets.ModelViewSet):
    queryset = UserSpace.objects
    serializer_class = UserSpaceSerializer
    permission_classes = [(CustomIsSpaceOwner | CustomIsOwnerReadOnly) & CustomTokenHasReadWriteScope]
    permission_classes = [(CustomIsSpaceOwner | (IsReadOnlyDRF & CustomIsUser) | CustomIsOwnerReadOnly) & CustomTokenHasReadWriteScope]
    http_method_names = ['get', 'put', 'patch', 'delete']
    pagination_class = DefaultPagination

@@ -579,10 +720,23 @@ class UserSpaceViewSet(LoggingMixin, viewsets.ModelViewSet):
        if internal_note is not None:
            self.queryset = self.queryset.filter(internal_note=internal_note)

        if is_space_owner(self.request.user, self.request.space):
        # >= admins can see all users, guest/user can only see themselves
        if has_group_permission(self.request.user, ['admin']):
            return self.queryset.filter(space=self.request.space)
        else:
            return self.queryset.filter(user=self.request.user, space=self.request.space)
        return self.queryset.filter(space=self.request.space, user=self.request.user)

    @extend_schema(responses=UserSpaceSerializer(many=True))
    @decorators.action(detail=False, pagination_class=DefaultPagination, methods=['GET'], serializer_class=UserSpaceSerializer, )
    def all_personal(self, request):
        """
        return all userspaces for the user requesting the endpoint
        :param request:
        :return:
        """
        with scopes_disabled():
            self.queryset = self.queryset.filter(user=self.request.user)
            return Response(self.serializer_class(self.queryset.all(), many=True, context={'request': self.request}).data)


class UserPreferenceViewSet(LoggingMixin, viewsets.ModelViewSet):
@@ -620,7 +774,7 @@ class SearchPreferenceViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.queryset.filter(user=self.request.user)


class AiProviderViewSet(LoggingMixin, viewsets.ModelViewSet):
class AiProviderViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    queryset = AiProvider.objects
    serializer_class = AiProviderSerializer
    permission_classes = [CustomAiProviderPermission & CustomTokenHasReadWriteScope]
@@ -643,7 +797,7 @@ class AiLogViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.queryset.filter(space=self.request.space)


class StorageViewSet(LoggingMixin, viewsets.ModelViewSet):
class StorageViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    # TODO handle delete protect error and adjust test
    queryset = Storage.objects
    serializer_class = StorageSerializer
@@ -654,7 +808,7 @@ class StorageViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.queryset.filter(space=self.request.space)


class SyncViewSet(LoggingMixin, viewsets.ModelViewSet):
class SyncViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    queryset = Sync.objects
    serializer_class = SyncSerializer
    permission_classes = [CustomIsAdmin & CustomTokenHasReadWriteScope]
@@ -715,7 +869,7 @@ class RecipeImportViewSet(LoggingMixin, viewsets.ModelViewSet):
        return Response({'msg': 'ok'}, status=status.HTTP_200_OK)


class ConnectorConfigViewSet(LoggingMixin, viewsets.ModelViewSet):
class ConnectorConfigViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    queryset = ConnectorConfig.objects
    serializer_class = ConnectorConfigSerializer
    permission_classes = [CustomIsAdmin & CustomTokenHasReadWriteScope]
@@ -725,7 +879,7 @@ class ConnectorConfigViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.queryset.filter(space=self.request.space)


class SupermarketViewSet(LoggingMixin, StandardFilterModelViewSet):
class SupermarketViewSet(LoggingMixin, StandardFilterModelViewSet, DeleteRelationMixing):
    queryset = Supermarket.objects
    serializer_class = SupermarketSerializer
    permission_classes = [CustomIsUser & CustomTokenHasReadWriteScope]
@@ -737,7 +891,7 @@ class SupermarketViewSet(LoggingMixin, StandardFilterModelViewSet):


# TODO does supermarket category have settings to support fuzzy filtering and/or merge?
class SupermarketCategoryViewSet(LoggingMixin, FuzzyFilterMixin, MergeMixin):
class SupermarketCategoryViewSet(LoggingMixin, FuzzyFilterMixin, MergeMixin, DeleteRelationMixing):
    queryset = SupermarketCategory.objects
    model = SupermarketCategory
    serializer_class = SupermarketCategorySerializer
@@ -760,7 +914,7 @@ class SupermarketCategoryRelationViewSet(LoggingMixin, StandardFilterModelViewSe
        return super().get_queryset()


class KeywordViewSet(LoggingMixin, TreeMixin):
class KeywordViewSet(LoggingMixin, TreeMixin, DeleteRelationMixing):
    queryset = Keyword.objects
    model = Keyword
    serializer_class = KeywordSerializer
@@ -768,7 +922,7 @@ class KeywordViewSet(LoggingMixin, TreeMixin):
    pagination_class = DefaultPagination


class UnitViewSet(LoggingMixin, MergeMixin, FuzzyFilterMixin):
class UnitViewSet(LoggingMixin, MergeMixin, FuzzyFilterMixin, DeleteRelationMixing):
    queryset = Unit.objects
    model = Unit
    serializer_class = UnitSerializer
@@ -788,7 +942,7 @@ class FoodInheritFieldViewSet(LoggingMixin, viewsets.ReadOnlyModelViewSet):
        return super().get_queryset()


class FoodViewSet(LoggingMixin, TreeMixin):
class FoodViewSet(LoggingMixin, TreeMixin, DeleteRelationMixing):
    queryset = Food.objects
    model = Food
    serializer_class = FoodSerializer
@@ -934,6 +1088,82 @@ class FoodViewSet(LoggingMixin, TreeMixin):
            return JsonResponse({'msg': 'there was an error parsing the FDC data, please check the server logs'},
                                status=500, json_dumps_params={'indent': 4})

    @extend_schema(
        parameters=[
            OpenApiParameter(name='provider', description='ID of the AI provider that should be used for this AI request', type=int),
        ]
    )
    @decorators.action(detail=True, methods=['POST'], )
    def aiproperties(self, request, pk):
        serializer = RecipeSerializer(data=request.data, partial=True, context={'request': request})
        if serializer.is_valid():

            if not request.query_params.get('provider', None) or not re.match(r'^(\d)+$', request.query_params.get('provider', None)):
                response = {
                    'error': True,
                    'msg': _('You must select an AI provider to perform your request.'),
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)

            if not can_perform_ai_request(request.space):
                response = {
                    'error': True,
                    'msg': _("You don't have any credits remaining to use AI or AI features are not enabled for your space."),
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)

            ai_provider = AiProvider.objects.filter(pk=request.query_params.get('provider')).filter(Q(space=request.space) | Q(space__isnull=True)).first()

            litellm.callbacks = [AiCallbackHandler(request.space, request.user, ai_provider, AiLog.F_FOOD_PROPERTIES)]

            property_type_list = list(PropertyType.objects.filter(space=request.space).values('id', 'name', 'description', 'unit', 'category', 'fdc_id'))
            messages = [
                {
                    "role": "user",
                    "content": [
                        {
                            "type": "text",
                            "text": "Given the following food and the following different types of properties please update the food so that the properties attribute contains a list with all property types in the following format [{property_amount: <the property value>, property_type: {id: <the ID of the property type>, name: <the name of the property type>}}]."
                                    "The property values should be in the unit given in the property type and for the amount specified in the properties_food_amount attribute of the food, which is given in the properties_food_unit."
                                    "property_amount is a decimal number. Please try to keep a precision of two decimal places if given in your source data."
                                    "Do not make up any data. If there is no data available for the given property type that is ok, just return null as a property_amount for that property type. Do not change anything else!"
                                    "Most property types are likely going to be nutritional values. Please do not make up any values, only return values you can find in the sources available to you."
                                    "Only return values if you are sure they are meant for the food given. Under no circumstance are you allowed to change any other value of the given food or change the structure in any way or form."
                        },
                        {
                            "type": "text",
                            "text": json.dumps(request.data)
                        },
                        {
                            "type": "text",
                            "text": json.dumps(property_type_list)
                        },
                    ]
                },
            ]

            try:
                ai_request = {
                    'api_key': ai_provider.api_key,
                    'model': ai_provider.model_name,
                    'response_format': {"type": "json_object"},
                    'messages': messages,
                }
                if ai_provider.url:
                    ai_request['api_base'] = ai_provider.url
                ai_response = completion(**ai_request)

                response_text = ai_response.choices[0].message.content

                return Response(json.loads(response_text), status=status.HTTP_200_OK)
            except BadRequestError as err:
                pass
                response = {
                    'error': True,
                    'msg': 'The AI could not process your request. \n\n' + err.message,
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)

    def destroy(self, *args, **kwargs):
        try:
            return (super().destroy(self, *args, **kwargs))
@@ -1036,7 +1266,7 @@ class FoodViewSet(LoggingMixin, TreeMixin):
    OpenApiParameter(name='order_direction', description='Order ascending or descending', type=str,
                     enum=['asc', 'desc']),
]))
class RecipeBookViewSet(LoggingMixin, StandardFilterModelViewSet):
class RecipeBookViewSet(LoggingMixin, StandardFilterModelViewSet, DeleteRelationMixing):
    queryset = RecipeBook.objects
    serializer_class = RecipeBookSerializer
    permission_classes = [(CustomIsOwner | CustomIsShared) & CustomTokenHasReadWriteScope]
@@ -1190,7 +1420,7 @@ class AutoPlanViewSet(LoggingMixin, mixins.CreateModelMixin, viewsets.GenericVie
        return Response(serializer.errors, 400)


class MealTypeViewSet(LoggingMixin, viewsets.ModelViewSet):
class MealTypeViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    """
    returns list of meal types created by the
    requesting user ordered by the order field.
@@ -1222,7 +1452,19 @@ class IngredientViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.serializer_class

    def get_queryset(self):
        queryset = self.queryset.filter(step__recipe__space=self.request.space)
        queryset = self.queryset.prefetch_related('food',
                                                  'food__properties',
                                                  'food__properties__property_type',
                                                  'food__inherit_fields',
                                                  'food__supermarket_category',
                                                  'food__onhand_users',
                                                  'food__substitute',
                                                  'food__child_inherit_fields',
                                                  'unit',
                                                  'unit__unit_conversion_base_relation',
                                                  'unit__unit_conversion_base_relation__base_unit',
                                                  'unit__unit_conversion_converted_relation',
                                                  'unit__unit_conversion_converted_relation__converted_unit', ).filter(step__recipe__space=self.request.space)
        food = self.request.query_params.get('food', None)
        if food and re.match(r'^(\d)+$', food):
            queryset = queryset.filter(food_id=food)
@@ -1328,7 +1570,7 @@ class RecipePagination(PageNumberPagination):
    OpenApiParameter(name='filter', description=_('ID of a custom filter. Returns all recipes matched by that filter.'), type=int),
    OpenApiParameter(name='makenow', description=_('Filter recipes that can be made with OnHand food. [''true''/''<b>false</b>'']'), type=bool),
]))
class RecipeViewSet(LoggingMixin, viewsets.ModelViewSet):
class RecipeViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    queryset = Recipe.objects
    serializer_class = RecipeSerializer
    # TODO split read and write permission for meal plan guest
@@ -1610,7 +1852,7 @@ class UnitConversionViewSet(LoggingMixin, viewsets.ModelViewSet):
                     enum=[m[0] for m in PropertyType.CHOICES])
    ]
))
class PropertyTypeViewSet(LoggingMixin, viewsets.ModelViewSet):
class PropertyTypeViewSet(LoggingMixin, viewsets.ModelViewSet, DeleteRelationMixing):
    queryset = PropertyType.objects
    serializer_class = PropertyTypeSerializer
    permission_classes = [CustomIsUser & CustomTokenHasReadWriteScope]
@@ -1841,7 +2083,7 @@ class BookmarkletImportViewSet(LoggingMixin, viewsets.ModelViewSet):
        return self.queryset.filter(space=self.request.space).all()


class UserFileViewSet(LoggingMixin, StandardFilterModelViewSet):
class UserFileViewSet(LoggingMixin, StandardFilterModelViewSet, DeleteRelationMixing):
    queryset = UserFile.objects
    serializer_class = UserFileSerializer
    permission_classes = [CustomIsUser & CustomTokenHasReadWriteScope]
@@ -1893,8 +2135,8 @@ class InviteLinkViewSet(LoggingMixin, StandardFilterModelViewSet):
        if internal_note is not None:
            self.queryset = self.queryset.filter(internal_note=internal_note)

        unused = self.request.query_params.get('unused', False)
        if unused:
        used = self.request.query_params.get('used', False)
        if not used:
            self.queryset = self.queryset.filter(used_by=None)

        if is_space_owner(self.request.user, self.request.space):
@@ -2130,7 +2372,7 @@ class AiImportView(APIView):

        ai_provider = AiProvider.objects.filter(pk=serializer.validated_data['ai_provider_id']).filter(Q(space=request.space) | Q(space__isnull=True)).first()

        litellm.callbacks = [AiCallbackHandler(request.space, request.user, ai_provider)]
        litellm.callbacks = [AiCallbackHandler(request.space, request.user, ai_provider, AiLog.F_FILE_IMPORT)]

        messages = []
        uploaded_file = serializer.validated_data['file']
@@ -2247,6 +2489,80 @@ class AiImportView(APIView):
        return Response(RecipeFromSourceResponseSerializer(context={'request': request}).to_representation(response), status=status.HTTP_400_BAD_REQUEST)


class AiStepSortView(APIView):
    throttle_classes = [AiEndpointThrottle]
    permission_classes = [CustomIsUser & CustomTokenHasReadWriteScope]

    @extend_schema(request=RecipeSerializer(many=False), responses=RecipeSerializer(many=False),
                   parameters=[
                       OpenApiParameter(name='provider', description='ID of the AI provider that should be used for this AI request', type=int),
                   ])
    def post(self, request, *args, **kwargs):
        """
        given a recipe, use AI to split its instructions into multiple coherent steps and sort the ingredients to the matching steps
        """
        serializer = RecipeSerializer(data=request.data, partial=True, context={'request': request})
        if serializer.is_valid():

            if not request.query_params.get('provider', None) or not re.match(r'^(\d)+$', request.query_params.get('provider', None)):
                response = {
                    'error': True,
                    'msg': _('You must select an AI provider to perform your request.'),
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)

            if not can_perform_ai_request(request.space):
                response = {
                    'error': True,
                    'msg': _("You don't have any credits remaining to use AI or AI features are not enabled for your space."),
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)

            ai_provider = AiProvider.objects.filter(pk=request.query_params.get('provider')).filter(Q(space=request.space) | Q(space__isnull=True)).first()

            litellm.callbacks = [AiCallbackHandler(request.space, request.user, ai_provider, AiLog.F_STEP_SORT)]

            messages = [
                {
                    "role": "user",
                    "content": [
                        {
                            "type": "text",
                            "text": "You are given data for a recipe formatted as json. You cannot under any circumstance change the value of any of the fields. You are only allowed to split the instructions into multiple steps and to sort the ingredients to their appropriate step. Your goal is to properly structure the recipe by splitting large instructions into multiple coherent steps and putting the ingredients that belong to this step into the ingredients list. Generally an ingredient of a cooking recipe should occur in the first step where it's needed. Please sort the ingredients to the appropriate steps without changing any of the actual field values. Return the recipe in the same format you were given as json. Do not change any field value like strings or numbers, or change the sorting, also do not change the language."

                        },
                        {
                            "type": "text",
                            "text": json.dumps(request.data)
                        },

                    ]
                },
            ]

            try:
                ai_request = {
                    'api_key': ai_provider.api_key,
                    'model': ai_provider.model_name,
                    'response_format': {"type": "json_object"},
                    'messages': messages,
                }
                if ai_provider.url:
                    ai_request['api_base'] = ai_provider.url
                ai_response = completion(**ai_request)

                response_text = ai_response.choices[0].message.content
                # TODO validate by loading/dumping using serializer ?

                return Response(json.loads(response_text), status=status.HTTP_200_OK)
            except BadRequestError as err:
                response = {
                    'error': True,
                    'msg': 'The AI could not process your request. \n\n' + err.message,
                }
                return Response(response, status=status.HTTP_400_BAD_REQUEST)
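A hedged usage sketch for the new endpoint (the client fixture and the payload are illustrative; the url name `api_ai_step_sort` comes from the urls.py hunk above):

```python
# Illustrative only: ask the AI to restructure a recipe's steps
r = u1_s1.post(reverse('api_ai_step_sort') + '?provider=1',
               {'name': 'Soup', 'steps': []},
               content_type='application/json')
# expect 200 with the re-sorted recipe JSON, or 400 if provider/credits are missing
```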


class AppImportView(APIView):
    parser_classes = [MultiPartParser]
    throttle_classes = [RecipeImportThrottle]
@@ -2267,7 +2583,13 @@ class AppImportView(APIView):
            files = []
            for f in request.FILES.getlist('files'):
                files.append({'file': io.BytesIO(f.read()), 'name': f.name})
            t = threading.Thread(target=integration.do_import, args=[files, il, form.cleaned_data['duplicates']])
            t = threading.Thread(target=integration.do_import,
                                 args=[files, il, form.cleaned_data['duplicates']],
                                 kwargs={'meal_plans': form.cleaned_data['meal_plans'],
                                         'shopping_lists': form.cleaned_data['shopping_lists'],
                                         'nutrition_per_serving': form.cleaned_data['nutrition_per_serving']
                                         }
                                 )
            t.setDaemon(True)
            t.start()


@@ -17,6 +17,7 @@ from cookbook.integration.copymethat import CopyMeThat
from cookbook.integration.default import Default
from cookbook.integration.domestica import Domestica
from cookbook.integration.mealie import Mealie
from cookbook.integration.mealie1 import Mealie1
from cookbook.integration.mealmaster import MealMaster
from cookbook.integration.melarecipes import MelaRecipes
from cookbook.integration.nextcloud_cookbook import NextcloudCookbook
@@ -45,6 +46,8 @@ def get_integration(request, export_type):
        return NextcloudCookbook(request, export_type)
    if export_type == ImportExportBase.MEALIE:
        return Mealie(request, export_type)
    if export_type == ImportExportBase.MEALIE1:
        return Mealie1(request, export_type)
    if export_type == ImportExportBase.CHOWDOWN:
        return Chowdown(request, export_type)
    if export_type == ImportExportBase.SAFFRON:

@@ -54,7 +54,7 @@ def hook(request, token):
                f = ingredient_parser.get_food(food)
                u = ingredient_parser.get_unit(unit)

                ShoppingListEntry.objects.create(food=f, unit=u, amount=amount, created_by=request.user, space=request.space)
                ShoppingListEntry.objects.create(food=f, unit=u, amount=max(1, amount), created_by=request.user, space=request.space)

                return JsonResponse({'data': data['message']['text']})
        except Exception:

@@ -21,7 +21,7 @@ from django.http import HttpResponseRedirect, JsonResponse, HttpResponse
from django.shortcuts import get_object_or_404, render
from django.templatetags.static import static
from django.urls import reverse, reverse_lazy
from django.utils.datetime_safe import date
from django.utils import timezone
from django.utils.translation import gettext as _
from django_scopes import scopes_disabled
from drf_spectacular.views import SpectacularRedocView, SpectacularSwaggerView
@@ -42,6 +42,9 @@ def index(request, path=None, resource=None):
    if User.objects.count() < 1 and 'django.contrib.auth.backends.RemoteUserBackend' not in settings.AUTHENTICATION_BACKENDS:
        return HttpResponseRedirect(reverse_lazy('view_setup'))

    if 'signup_token' in request.session:
        return HttpResponseRedirect(reverse('view_invite', args=[request.session.pop('signup_token', '')]))

    if request.user.is_authenticated or re.search(r'/recipe/\d+/', request.path[:512]) and request.GET.get('share'):
        return render(request, 'frontend/tandoor.html', {})
    else:
@@ -98,7 +101,7 @@ def space_overview(request):
                                     max_users=settings.SPACE_DEFAULT_MAX_USERS,
                                     allow_sharing=settings.SPACE_DEFAULT_ALLOW_SHARING,
                                     ai_enabled=settings.SPACE_AI_ENABLED,
                                     ai_credits_monthly=settings.SPACE_AI_CREDITS_MONTHLY,)
                                     ai_credits_monthly=settings.SPACE_AI_CREDITS_MONTHLY, )

        user_space = UserSpace.objects.create(space=created_space, user=request.user, active=False)
        user_space.groups.add(Group.objects.filter(name='admin').get())
@@ -223,7 +226,7 @@ def system(request):
        total_stats = ['All', int(r.get('api:request-count'))]

        for i in range(0, 6):
            d = (date.today() - timedelta(days=i)).isoformat()
            d = (timezone.now() - timedelta(days=i)).isoformat()
            api_stats[0].append(d)
            api_space_stats[0].append(d)
            total_stats.append(int(r.get(f'api:request-count:{d}')) if r.get(f'api:request-count:{d}') else 0)
@@ -234,7 +237,7 @@ def system(request):
            endpoint = x[0].decode('utf-8')
            endpoint_stats = [endpoint, x[1]]
            for i in range(0, 6):
                d = (date.today() - timedelta(days=i)).isoformat()
                d = (timezone.now() - timedelta(days=i)).isoformat()
                endpoint_stats.append(r.zscore(f'api:endpoint-request-count:{d}', endpoint))
            api_stats.append(endpoint_stats)

@@ -243,7 +246,7 @@ def system(request):
            if space := Space.objects.filter(pk=s).first():
                space_stats = [space.name, x[1]]
                for i in range(0, 6):
                    d = (date.today() - timedelta(days=i)).isoformat()
                    d = (timezone.now() - timedelta(days=i)).isoformat()
                    space_stats.append(r.zscore(f'api:space-request-count:{d}', s))
                api_space_stats.append(space_stats)

@@ -322,7 +325,7 @@ def invite_link(request, token):
    try:
        token = UUID(token, version=4)
    except ValueError:
        messages.add_message(request, messages.ERROR, _('Malformed Invite Link supplied!'))
        print('Malformed Invite Link supplied!')
        return HttpResponseRedirect(reverse('index'))

    if link := InviteLink.objects.filter(valid_until__gte=datetime.today(), used_by=None, uuid=token).first():
@@ -331,22 +334,17 @@ def invite_link(request, token):
            link.used_by = request.user
            link.save()

            user_space = UserSpace.objects.create(user=request.user, space=link.space, internal_note=link.internal_note, invite_link=link, active=False)

            if request.user.userspace_set.count() == 1:
                user_space.active = True
                user_space.save()
            UserSpace.objects.filter(user=request.user).update(active=False)
            user_space = UserSpace.objects.create(user=request.user, space=link.space, internal_note=link.internal_note, invite_link=link, active=True)

            user_space.groups.add(link.group)

            messages.add_message(request, messages.SUCCESS, _('Successfully joined space.'))
            return HttpResponseRedirect(reverse('view_space_overview'))
            return HttpResponseRedirect(reverse('index'))
        else:
            request.session['signup_token'] = str(token)
            return HttpResponseRedirect(reverse('account_signup'))

    messages.add_message(request, messages.ERROR, _('Invite Link not valid or already used!'))
    return HttpResponseRedirect(reverse('view_space_overview'))
    return HttpResponseRedirect(reverse('index'))


def report_share_abuse(request, token):
@@ -97,10 +97,17 @@ Follow these steps to import your recipes

Mealie provides structured data similar to nextcloud.

!!! WARNING "Versions"
    There are two different versions of the Mealie importer. One for all backups created prior to Version 1.0 and one for all after.

!!! INFO "Nutrition"
    The Mealie UI does not indicate whether or not nutrition information is stored per serving or per recipe. This choice is left to the user. During the import you will have to choose
    how Tandoor should treat your nutrition data.

To migrate your recipes

1. Go to your Mealie settings and create a new Backup.
2. Download the backup by clicking on it and pressing download (this wasn't working for me, so I had to manually pull it from the server).
1. Go to your Mealie admin settings and create a new backup.
2. Download the backup.
3. Upload the entire `.zip` file to the importer page and import everything.

## Chowdown

@@ -212,6 +212,6 @@ location /static/ {
Tandoor 1 includes gunicorn, a python WSGI server that handles python code well but is not meant to serve mediafiles. Thus, it has always been recommended to set up a nginx webserver
(not just a reverse proxy) in front of Tandoor to handle mediafiles. The gunicorn server by default is exposed on port 8080.

Tandoor 2 now occasionally bundles nginx inside the container and exposes port 80 where mediafiles are handled by nginx and all the other requests are (mostly) passed to gunicorn.
Tandoor 2 now bundles nginx inside the container and exposes port 80 where mediafiles are handled by nginx and all the other requests are (mostly) passed to gunicorn.

A [GitHub Issue](https://github.com/TandoorRecipes/recipes/issues/3851) has been created to allow for discussions and FAQ's on this issue while this change is fresh. It will later be updated in the docs here if necessary.
@@ -1,9 +1,6 @@
!!! info "Community Contributed"
    This guide was contributed by the community and is neither officially supported, nor updated or tested. Since I cannot test it myself, feedback and improvements are always very welcome.

!!! danger "Tandoor 2 Compatibility"
    This guide has not been verified/tested for Tandoor 2, which now integrates a nginx service inside the default docker container and exposes its service on port 80 instead of 8080.

## **Instructions**

Basic guide to setup `vabenee1111/recipes` docker container on Synology NAS.
@@ -12,7 +9,7 @@ Basic guide to setup `vabenee1111/recipes` docker container on Synology NAS.
- Login to Synology DSM through your browser
- Install `Container Manager` through package center
- Install `Text Editor` through package center (needed to edit `.env` if you don't edit it locally first)
- If you do not already have a `docker` folder in your File Station, create one at the root of your volume.
- If you do not already have a shared `docker` folder in your File Station, create one at the root of your volume.
- inside of your `volume1/docker` folder, create a `recipes` folder.
- Within, create the necessary folder structure. You will need these folders:

@@ -44,9 +41,13 @@ volume1/docker/

### 4. Edit docker-compose.yml
- Paste the `docker-compose.yml` into the `source` textbox.
- This file tells docker how to set up recipes. Docker will create three containers for recipes to work, recipes, nginx and postgresql. They are all required and need to store and share data through the folders you created before.
- Under the `nginx_recipes` section, look for `ports` that lists `80:80` as the default. This line specifies which external synology port will point to which internal docker port. Choose a free port to use and replace the first number with it. You will open recipes by browsing to http://your.synology.ip:chosen.port, e.g. http://192.168.1.1:2000
- If you want to use port 2000 you would edit the `ports` to `2000:80`
- This file tells docker how to set up recipes. Docker will create two containers for recipes to work, `web_recipes` and `db_recipes`. Both are required and need to store and share data through the folders you created before.
- Under the `web_recipes` section, you can add `ports`. This line specifies which external synology port will point to which internal docker port. Choose a free port to use and replace the first number with it. You will open recipes by browsing to http://your.synology.ip:chosen.port, e.g. http://192.168.1.1:2000
- If you want to use port 2000 you would edit the `ports` to `2000:8080`
```
ports:
  - "2000:8080"
```

### 5. Finishing up
- Click `Next`.
@@ -59,25 +60,27 @@ volume1/docker/
Container recipes-db_recipes-1 Started
Container recipes-web_recipes-1 Starting
Container recipes-web_recipes-1 Started
Container recipes-nginx_recipes-1 Starting
Container recipes-nginx_recipes-1 Started
Exit Code: 0
```
- If you get an error, review the error and fix. A common reason it might fail is because you did not create the folders specified in the directory tree in step 1.
- If you notice that `web_recipes` cannot connect to the database there is probably a misconfiguration in your firewall, see the next section.
- Browse to 192.168.1.1:2000 or whatever your IP and port are

### 6. Firewall
!!!info "Deprecated?" This section may be deprecated and may no longer be needed. The container may be able to be used without any firewall rules enabled. Further datapoints are needed before this section or this warning is removed.

You need to set up firewall rules in order for the recipes_web container to be able to connect to the recipes_db container.
You need to set up firewall rules in order for the `recipes_web` container to be able to connect to the `recipes_db` container.

- Control Panel -> Security -> Firewall -> Edit Rules -> Create
  - Ports: All
    - if you only want to allow the database port 5432, you'll need to create 2 rules to allow the ingoing and outgoing port (and optionally restrict the source IP further)
  - Source IP: Specific IP -> Select -> Subnet
    - insert docker network ip (can be found in the docker application, network tab)
    - Example: IP address: 172.18.0.0 and Subnet mask/Prefix length: 255.255.255.0
  - Action: Allow
- Save and make sure it's above the deny rules
- If you want to skip the SSL Setup via Reverse Proxy you'll also need to allow access to the port you used to expose the container (e.g. 2000 in this example).
  - Ports: 2000
  - Source: depends on how broad the access should be
  - Action: Allow

### 7. Additional SSL Setup
Easiest way is to do it via Reverse Proxy.
@@ -102,7 +105,7 @@ Easiest way is to do it via Reverse Proxy.
- Action: Allow
- Save and make sure it's above the deny rules

### 8. Depreciated Guides
### 8. Deprecated Guides

The following are older guides that may be useful if you are running older versions of DSM.

@@ -111,4 +114,4 @@ The following are older guides that may be useful if you are running older versi

- There is also this
  ([word](https://github.com/vabene1111/recipes/files/6708738/Tandoor.on.a.Synology.Disk.Station.docx),
  [pdf](https://github.com/vabene1111/recipes/files/6901601/Tandoor.on.a.Synology.Disk.Station.pdf)) awesome and very detailed guide provided by @DiversityBug.
@@ -20,11 +20,13 @@ server {
    location / {
        proxy_set_header Host $http_host;
        proxy_pass http://localhost:${TANDOOR_PORT};
        error_page 502 /errors/http502.html;

        # disabled for now because it redirects to the error page and not back, also not showing html
        #error_page 502 /errors/http502.html;
    }

    location /errors/ {
        alias /etc/nginx/conf.d/errorpages/;
        alias /etc/nginx/http.d/errorpages/;
        internal;
    }
}
|
||||
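For readers following this hunk: `error_page 502` rewrites a failed upstream response to an internal URI, and the `/errors/` location serves the static file from disk, with `internal` hiding it from direct requests. A self-contained sketch of just that mechanism, with an assumed upstream address and paths:

```
# Sketch: serve a static maintenance page when the upstream is down.
# Upstream address and file paths are example values.
server {
    listen 8080;

    location / {
        proxy_pass http://127.0.0.1:9000;    # assumed upstream
        error_page 502 /errors/http502.html;
    }

    location /errors/ {
        alias /var/www/errorpages/;  # directory containing http502.html
        internal;                    # not reachable by direct request
    }
}
```
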
@@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-08-01 15:04+0200\n"
"POT-Creation-Date: 2025-09-22 20:15+0200\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"

@@ -69,70 +69,110 @@ msgstr ""
msgid "end"
msgstr ""

#: .\recipes\settings.py:478
#: .\recipes\plugins\enterprise_plugin\templates\enterprise_social_public_view.html:15
msgid "Tandoor Recipe Manager"
msgstr ""

#: .\recipes\settings.py:571
msgid "Armenian "
msgstr ""

#: .\recipes\settings.py:479
#: .\recipes\settings.py:572
msgid "Bulgarian"
msgstr ""

#: .\recipes\settings.py:480
#: .\recipes\settings.py:573
msgid "Catalan"
msgstr ""

#: .\recipes\settings.py:481
#: .\recipes\settings.py:574
msgid "Czech"
msgstr ""

#: .\recipes\settings.py:482
#: .\recipes\settings.py:575
msgid "Croatian"
msgstr ""

#: .\recipes\settings.py:576
msgid "Danish"
msgstr ""

#: .\recipes\settings.py:483
#: .\recipes\settings.py:577
msgid "Dutch"
msgstr ""

#: .\recipes\settings.py:484
#: .\recipes\settings.py:578
msgid "English"
msgstr ""

#: .\recipes\settings.py:485
#: .\recipes\settings.py:579
msgid "French"
msgstr ""

#: .\recipes\settings.py:486
#: .\recipes\settings.py:580
msgid "Finnish"
msgstr ""

#: .\recipes\settings.py:581
msgid "German"
msgstr ""

#: .\recipes\settings.py:487
#: .\recipes\settings.py:582
msgid "Greek"
msgstr ""

#: .\recipes\settings.py:583
msgid "Hebrew"
msgstr ""

#: .\recipes\settings.py:584
msgid "Hungarian"
msgstr ""

#: .\recipes\settings.py:488
#: .\recipes\settings.py:585
msgid "Italian"
msgstr ""

#: .\recipes\settings.py:489
#: .\recipes\settings.py:586
msgid "Latvian"
msgstr ""

#: .\recipes\settings.py:490
msgid "Norwegian "
#: .\recipes\settings.py:587
msgid "Norwegian"
msgstr ""

#: .\recipes\settings.py:491
#: .\recipes\settings.py:589
msgid "Polish"
msgstr ""

#: .\recipes\settings.py:492
#: .\recipes\settings.py:590
msgid "Portuguese"
msgstr ""

#: .\recipes\settings.py:591
msgid "Russian"
msgstr ""

#: .\recipes\settings.py:493
#: .\recipes\settings.py:592
msgid "Romanian"
msgstr ""

#: .\recipes\settings.py:593
msgid "Spanish"
msgstr ""

#: .\recipes\settings.py:494
#: .\recipes\settings.py:594
msgid "Slovenian"
msgstr ""

#: .\recipes\settings.py:595
msgid "Swedish"
msgstr ""

#: .\recipes\settings.py:596
msgid "Turkish"
msgstr ""

#: .\recipes\settings.py:597
msgid "Ukranian"
msgstr ""