Improve filter_tokens in core tokenizer #174

Merged · 2 commits · Jul 17, 2024
6 changes: 6 additions & 0 deletions uncoder-core/app/translator/core/parser.py
@@ -28,6 +28,7 @@
 from app.translator.core.models.platform_details import PlatformDetails
 from app.translator.core.models.query_container import RawQueryContainer, TokenizedQueryContainer
 from app.translator.core.models.query_tokens.field import Field
+from app.translator.core.models.query_tokens.field_field import FieldField
 from app.translator.core.models.query_tokens.field_value import FieldValue
 from app.translator.core.models.query_tokens.function_value import FunctionValue
 from app.translator.core.tokenizer import QueryTokenizer
@@ -68,6 +69,11 @@ def get_field_tokens(
         for token in query_tokens:
             if isinstance(token, FieldValue):
                 field_tokens.append(token.field)
+            elif isinstance(token, FieldField):
+                if token.field_left:
+                    field_tokens.append(token.field_left)
+                if token.field_right:
+                    field_tokens.append(token.field_right)
             elif isinstance(token, FunctionValue):
                 field_tokens.extend(self.tokenizer.get_field_tokens_from_func_args([token.function]))

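To make the parser.py change concrete, here is a minimal standalone sketch (using simplified stand-in classes, not the project's real token models) of how `get_field_tokens` now also collects both sides of a field-to-field comparison:

```python
# Minimal standalone sketch of the parser.py change. The classes below are
# simplified stand-ins for the project's token models, not the real ones.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Field:
    source_name: str


@dataclass
class FieldValue:  # stand-in for a `field <operator> value` token
    field: Field


@dataclass
class FieldField:  # stand-in for a `field <operator> field` token
    field_left: Optional[Field]
    field_right: Optional[Field]


def get_field_tokens(query_tokens) -> list[Field]:
    field_tokens = []
    for token in query_tokens:
        if isinstance(token, FieldValue):
            field_tokens.append(token.field)
        elif isinstance(token, FieldField):
            # New behaviour: both sides of the comparison are collected when present.
            if token.field_left:
                field_tokens.append(token.field_left)
            if token.field_right:
                field_tokens.append(token.field_right)
    return field_tokens


tokens = [FieldValue(Field("event_id")), FieldField(Field("src_ip"), Field("dst_ip"))]
print([f.source_name for f in get_field_tokens(tokens)])  # ['event_id', 'src_ip', 'dst_ip']
```
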
6 changes: 4 additions & 2 deletions uncoder-core/app/translator/core/tokenizer.py
@@ -338,7 +338,7 @@ def tokenize(self, query: str) -> list[QUERY_TOKEN_TYPE]:
     @staticmethod
     def filter_tokens(
         tokens: list[QUERY_TOKEN_TYPE],
-        token_type: Union[type[FieldValue], type[Field], type[Keyword], type[Identifier]],
+        token_type: Union[type[FieldValue], type[Field], type[FieldField], type[Keyword], type[Identifier]],
     ) -> list[QUERY_TOKEN_TYPE]:
         return [token for token in tokens if isinstance(token, token_type)]

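`filter_tokens` itself is just an `isinstance` filter, so widening the accepted `token_type` union means callers can now pull `FieldField` tokens out of a token stream directly. A self-contained sketch with simplified stand-in classes (not the project's real models):

```python
# Self-contained sketch of the widened filter_tokens signature; the token
# classes here are simplified stand-ins, not the project's real models.
from dataclasses import dataclass


@dataclass
class FieldValue:
    field: str


@dataclass
class FieldField:
    field_left: str
    field_right: str


def filter_tokens(tokens, token_type):
    # Mirrors the core helper: keep only tokens of the requested type(s).
    return [token for token in tokens if isinstance(token, token_type)]


tokens = [FieldValue("event_id"), FieldField("src_ip", "dst_ip")]
print(filter_tokens(tokens, FieldField))
# [FieldField(field_left='src_ip', field_right='dst_ip')]
```
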
@@ -363,7 +363,9 @@ def get_field_tokens_from_func_args( # noqa: PLR0912
                 result.extend(self.get_field_tokens_from_func_args(args=arg.args))
                 result.extend(self.get_field_tokens_from_func_args(args=arg.by_clauses))
                 result.extend(self.get_field_tokens_from_func_args(args=[arg.filter_]))
-            elif isinstance(arg, (JoinFunction, UnionFunction)):
+            elif isinstance(arg, JoinFunction):
+                result.extend(self.get_field_tokens_from_func_args(args=arg.condition))
+            elif isinstance(arg, UnionFunction):
                 continue
             elif isinstance(arg, Function):
                 result.extend(self.get_field_tokens_from_func_args(args=arg.args))
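
The second tokenizer.py hunk splits the combined `JoinFunction`/`UnionFunction` branch so that join conditions are now walked for their fields, while union arguments are still skipped. A simplified, self-contained sketch of that control flow (stand-in classes; treating `JoinFunction.condition` as a list of field/field-value tokens is an assumption here):

```python
# Simplified, self-contained sketch of the branch split in
# get_field_tokens_from_func_args. Classes are stand-ins for the project's
# models; the shape of JoinFunction.condition is an assumption.
from dataclasses import dataclass


@dataclass
class Field:
    source_name: str


@dataclass
class FieldValue:
    field: Field


@dataclass
class JoinFunction:
    condition: list


@dataclass
class UnionFunction:
    pass


def get_field_tokens_from_func_args(args) -> list[Field]:
    result = []
    for arg in args:
        if isinstance(arg, Field):
            result.append(arg)
        elif isinstance(arg, FieldValue):
            result.append(arg.field)
        elif isinstance(arg, JoinFunction):
            # New behaviour: recurse into the join condition for its fields.
            result.extend(get_field_tokens_from_func_args(arg.condition))
        elif isinstance(arg, UnionFunction):
            # Unchanged: union arguments contribute no fields.
            continue
    return result


args = [JoinFunction(condition=[FieldValue(Field("user_name"))]), UnionFunction()]
print([f.source_name for f in get_field_tokens_from_func_args(args)])  # ['user_name']
```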