API Reference
This section details the botex API. It consists of the command line interface botex
and a set of Python functions to set up and run oTree-based experiments using LLMs as bots.
The Python API can be divided into three main parts:
- Setup: Functions to set up the botex configuration and to start/stop a local llama.cpp server if needed.
- oTree Interface: Functions to interact with the oTree server, such as starting/stopping the server, reading session config data from it, initializing sessions, and running bots on sessions.
- Export data: Functions to export data from the botex and oTree databases.
Command Line Interface
The `botex` command line interface provides the option to set up and run oTree experiments with bots from the command line. It also allows the user to export data from both the botex and the oTree database.
botex
Run an oTree session with botex. All necessary arguments can be provided either
as command line arguments, in the environment file referenced by the -c
argument, or as environment variables.
Usage: botex [OPTIONS]
Options:
-c, --config TEXT Path to the environment file containing the
botex configuration. Defaults to 'botex.env'.
-i, --ignore Ignore any environment variables and config
files.
-b, --botex-db TEXT Path to the botex SQLite database file (will be
created if it does not exist). Read from
environment variable BOTEX_DB if not provided.
Defaults to 'botex.sqlite3'.
-u, --otree-server-url TEXT oTree server URL. Read from environment
variable OTREE_SERVER_URL if not provided.
Defaults to 'http://localhost:8000'.
-r, --otree-rest-key TEXT oTree secret key for its REST API. Read from
environment variable OTREE_REST_KEY if not
provided. Only required if the oTree server is
running in DEMO or STUDY mode.
-m, --model TEXT Path to the LLM model to use for botex. Read
from environment variable LLM_MODEL if not
provided. If environment variable is not set,
you will be prompted for the model.
-k, --api-key TEXT API key for the LLM model. Read from
environment variable API_KEY if not provided.
If environment variable is not set, you will be
prompted to enter the key.
-a, --api-base TEXT Base URL for the LLM model. If not provided it
will default to None for LiteLLM and
http://localhost:8080 for llama.cpp.
--llamacpp-server TEXT Path to the llama.cpp server executable.
Required if the model is 'llamacpp'. Read from
environment variable LLAMACPP_SERVER_PATH if
not provided.
--llamacpp-local-llm TEXT Path to the local llama.cpp model. Required if
the model is 'llamacpp'. Read from environment
variable LLAMACPP_LOCAL_LLM_PATH if not
provided.
-s, --session-config TEXT oTree session config to run. If not provided,
and also not set by the environment variable
OTREE_SESSION_CONFIG, you will be prompted to
select from the available session
configurations.
-p, --nparticipants INTEGER Number of participants to run the session with.
Read from environment variable
OTREE_NPARTICIPANTS if not provided.
-n, --nhumans INTEGER Number of human participants to include in the
session. Read from environment variable
OTREE_NHUMANS if not provided.
-e, --export-csv-file TEXT CSV file to export botex data to. If not
provided, you will be prompted to enter a file
name after the session is complete.
-x, --no-throttle Disables throttling requests to deal with rate
limiting. Defaults to False. If set to True,
you might run into rate limiting resulting in
failed bot runs.
-v, --verbose Print out botex logs while running. Defaults to
False.
--help Show this message and exit.
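As an illustration, a typical invocation (not executed here; assumes botex is installed and an oTree server is reachable at the default URL, and `my_session_config` is a placeholder for one of your oTree session configs) could look like this:

```shell
# Run 'my_session_config' with 6 participants, all bots, verbose logging,
# exporting botex response data to botex_data.csv when the session is done.
botex -c botex.env -s my_session_config -p 6 -n 0 -e botex_data.csv -v
```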
Python API: Setup
The botex configuration can be provided via function parameters or by setting environment variables. The latter is useful for secrets like API keys and also makes handling the API easier if you run repeated experiments. This can be facilitated by calling the function `load_botex_env()`, which reads an `.env` file (`botex.env` by default). For users who want to use local LLMs for inference, botex can also start and/or stop a local llama.cpp instance.
load_botex_env
Load botex environment variables from a file.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `env_file` | `str` | The path to the .env file containing the botex configuration. Defaults to "botex.env". | `'botex.env'` |

Returns:

| Name | Type | Description |
|---|---|---|
| Bool | `bool` | True if at least one botex environment variable was set. |
Source code in src/botex/env.py
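As an illustration, a minimal `botex.env` file could look as follows (variable names as used throughout this reference; the values are placeholders):

```
# botex.env -- read by load_botex_env()
BOTEX_DB=botex.sqlite3
OTREE_SERVER_URL=http://localhost:8000
OTREE_REST_KEY=your_otree_rest_key
LLM_MODEL=gpt-4o-2024-08-06
API_KEY=your_llm_api_key
```

Calling `load_botex_env()` after creating such a file returns True if at least one botex environment variable was set.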
start_llamacpp_server
Starts a llama.cpp server instance.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `config` | `dict \| None` | A dict containing at least the keys `server_path` and `local_llm_path`. See the additional details below for further configuration keys. | `None` |

Returns:

| Type | Description |
|---|---|
| `Popen` | The process of the running llama.cpp server if the start was successful. |

Raises:

| Type | Description |
|---|---|
| `Exception` | If the server is already running or if starting the server fails. |
Additional details
You can provide other configuration parameters for the local model in the model configuration dictionary. These include:
- `server_url` (str): The base URL for the llama.cpp
server, defaults to `"http://localhost:8080"`.
- `context_length` (int): The context length for the model.
If `None`, botex will try to get the context length from the
local model metadata; if that is not possible, it defaults
to `4096`.
- `number_of_layers_to_offload_to_gpu` (int): The number of
layers to offload to the GPU, defaults to `0`.
- `temperature` (float): The temperature for the model,
defaults to `0.5`.
- `maximum_tokens_to_predict` (int): The maximum number of
tokens to predict, defaults to `10000`.
- `top_p` (float): The top-p value for the model,
defaults to `0.9`.
- `top_k` (int): The top-k value for the model,
defaults to `40`.
- `num_slots` (int): The number of slots for the model,
defaults to `1`.
For all these keys, if not provided in the configuration dictionary, botex will try to get the value from environment variables (in all capital letters, prefixed by LLAMACPP_); if that is not possible, it will use the default value.
Example
```python
from botex.llamacpp import start_llamacpp_server

config = {
    "server_path": "/path/to/llama.cpp",
    "local_llm_path": "/path/to/local/model",
    "server_url": "http://localhost:8080",
    "context_length": 4096,
    "number_of_layers_to_offload_to_gpu": 0,
    "temperature": 0.8,
    "maximum_tokens_to_predict": 10000,
    "top_p": 0.9,
    "top_k": 40,
    "num_slots": 1
}
```
Source code in src/botex/llamacpp.py
stop_llamacpp_server
Stops a running llama.cpp server instance.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `process` | `Popen` | The process of the running llama.cpp server. | required |

Returns:

| Type | Description |
|---|---|
| `None` | None (stops the running llama.cpp server) |
Source code in src/botex/llamacpp.py
Python API: oTree Interface
Running experiments with botex requires access to an oTree server with an active session. The following functions allow the user to interact with the oTree server, e.g., to start/stop it, read session config data from it, and initialize sessions. Once a session is initialized, the core functions `run_bots_on_session()` and `run_single_bot()` can be used to run bots on the session.
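The typical flow of these functions can be sketched as follows (all names and paths are placeholders; this assumes the functions are exposed at the package level, as elsewhere in this reference):

```python
def run_experiment():
    """Sketch: start oTree, initialize a session, run bots, stop oTree."""
    import botex  # requires `pip install botex`

    otree = botex.start_otree_server(project_path="/path/to/otree/project")
    try:
        session = botex.init_otree_session(
            config_name="my_session_config",  # placeholder
            npart=6,
            botex_db="botex.sqlite3",
        )
        botex.run_bots_on_session(
            session_id=session["session_id"],
            botex_db="botex.sqlite3",
        )
    finally:
        # Stop the server even if the bot run fails.
        botex.stop_otree_server(otree)
```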
start_otree_server
Start an oTree server in a subprocess.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `project_path` | `str` | Path to your oTree project folder. If None (the default), it will be obtained from the environment variable OTREE_PROJECT_PATH. | `None` |
| `port` | `int` | The port to run the server on. If None (the default), it will first try to read from the environment variable OTREE_PORT. If that is not set, it will default to 8000. | `None` |
| `log_file` | `str` | Path to the log file. If None (the default), it will first try to read from the environment variable OTREE_LOG_FILE. If that is not set, it will default to 'otree.log'. | `None` |
| `auth_level` | `str` | The authentication level for the oTree server, set via the environment variable OTREE_AUTH_LEVEL. The default is None, which leaves this environment variable unchanged. If you use 'DEMO' or 'STUDY', the environment variable will be set accordingly and you need to provide a REST key via the argument `rest_key` below. | `None` |
| `rest_key` | `str` | The API key for the oTree server. If None (the default), it will be obtained from the environment variable OTREE_REST_KEY. | `None` |
| `admin_password` | `str` | The admin password for the oTree server. For this to work, your oTree project's settings need to read the admin password from the environment variable OTREE_ADMIN_PASSWORD. | `None` |
| `timeout` | `int` | Timeout in seconds to wait for the server. Defaults to 5. | `5` |

Returns:

| Type | Description |
|---|---|
| `Popen` | A subprocess object. |

Raises:

| Type | Description |
|---|---|
| `Exception` | If the oTree server does not start within the timeout. |
Source code in src/botex/otree.py
stop_otree_server
Stop an oTree server subprocess.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `otree_server` | `subprocess` | The subprocess object to be terminated. | required |

Returns:

| Type | Description |
|---|---|
| `int` | The return code of the oTree subprocess. |
Source code in src/botex/otree.py
otree_server_is_running
Check if an oTree server is running.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `server_url` | `str` | The URL of the oTree server. Read from environment variable OTREE_SERVER_URL if None (the default). | `None` |
| `rest_key` | `str` | The API key for the oTree server. Read from environment variable OTREE_REST_KEY if None (the default). | `None` |

Returns:

| Type | Description |
|---|---|
| `bool` | True if the server is running, False otherwise. |
Source code in src/botex/otree.py
get_session_configs
Get the session configurations from an oTree server.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `otree_server_url` | `str` | The URL of the oTree server. Read from environment variable OTREE_SERVER_URL if None (the default). | `None` |
| `otree_rest_key` | `str` | The API key for the oTree server. Read from environment variable OTREE_REST_KEY if None (the default). | `None` |

Returns:

| Type | Description |
|---|---|
| `dict` | The session configurations. |
Source code in src/botex/otree.py
init_otree_session
Initialize an oTree session with a given number of participants.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `config_name` | `str` | The name of the oTree session configuration. | required |
| `npart` | `int` | The total number of participants. | required |
| `nhumans` | `int` | The number of human participants (defaults to zero). Provide either nhumans or is_human, but not both. | `0` |
| `is_human` | `list` | A list of booleans indicating whether each participant is human. Needs to be the same length as npart. If None (the default), humans (if present) will be randomly assigned. | `None` |
| `room_name` | `str` | The name of the oTree room for the session. If None (the default), no room will be used. | `None` |
| `botex_db` | `str` | The name of the SQLite database file to store botex data. If None (the default), it will be obtained from the environment variable BOTEX_DB. | `None` |
| `otree_server_url` | `str` | The URL of the oTree server. If None (the default), it will be obtained from the environment variable OTREE_SERVER_URL. | `None` |
| `otree_rest_key` | `str` | The API key for the oTree server. If None (the default), it will be obtained from the environment variable OTREE_REST_KEY. | `None` |
| `modified_session_config_fields` | `dict` | A dictionary of fields to modify in the oTree session config. Default is None. | `None` |

Returns:

| Type | Description |
|---|---|
| `dict` | A dict with the keys 'session_id', 'participant_code', 'is_human', 'bot_urls', and 'human_urls' containing the session ID, participant codes, human indicators, and the URLs for the bot and human participants. |
Source code in src/botex/otree.py
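Based on the documented return keys, the result can be unpacked along these lines (all values below are made up for illustration; the URL format follows oTree's participant start links):

```python
# Hypothetical return value, following the documented keys.
session = {
    "session_id": "7f3k2m1x",
    "participant_code": ["p1", "p2", "p3"],
    "is_human": [False, False, True],
    "bot_urls": [
        "http://localhost:8000/InitializeParticipant/p1",
        "http://localhost:8000/InitializeParticipant/p2",
    ],
    "human_urls": ["http://localhost:8000/InitializeParticipant/p3"],
}

# One URL per participant, split by the is_human indicator.
assert len(session["bot_urls"]) + len(session["human_urls"]) == len(session["is_human"])
n_bots = sum(not h for h in session["is_human"])
```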
get_bot_urls
Get the URLs for the bot participants in an oTree session.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `session_id` | `str` | The ID of the oTree session. | required |
| `botex_db` | `str` | The name of the SQLite database file to store botex data. If None (the default), it will be obtained from the environment variable BOTEX_DB. | `None` |
| `already_started` | `bool` | If True, the function will also return URLs for bots that have already started but not yet finished. This is useful if bots did not start up properly because of network issues. Default is False. | `False` |

Returns:

| Type | Description |
|---|---|
| `List[str]` | List of URLs for the bot participants. |
Source code in src/botex/otree.py
run_bots_on_session
Run botex bots on an oTree session.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `session_id` | `str` | The ID of the oTree session. | required |
| `bot_urls` | `list` | A list of URLs for the bot participants. Will be retrieved from the database if None (the default). | `None` |
| `botex_db` | `str` | The name of the SQLite database file for botex data. If None (the default), it will be obtained from the environment variable BOTEX_DB. | `None` |
| `model` | `str` | The model to use for the bot. Default is 'gpt-4o-2024-08-06'. If you want to use local models, we suggest that you use llama.cpp; in this case, set this string to 'llamacpp'. | `'gpt-4o-2024-08-06'` |
| `api_key` | `str` | The API key for the model that you use. If None (the default), it will be obtained from environment variables by LiteLLM (e.g., OPENAI_API_KEY or GEMINI_API_KEY). | `None` |
| `api_base` | `str` | The base URL for the LLM server. Default is None so as not to interfere with the default LiteLLM behavior. If you want to use a local model with llama.cpp and have not explicitly set this parameter, it will default to 'http://localhost:8080'. | `None` |
| `throttle` | `bool` | Whether to slow down the bot's requests. Slowing down the requests can help to avoid rate limiting. Default is False. | `False` |
| `full_conv_history` | `bool` | Whether to keep the full conversation history. This will increase token use and only work with very short experiments. Default is False. | `False` |
| `user_prompts` | `dict` | A dictionary of user prompts to override the default prompts that the bot uses. If a key is not present in the dictionary, the default prompt will be used. If a key that is not among the default prompts is present, the bot will exit with a warning and not run, to make sure that the user is aware of the issue. | `None` |
| `already_started` | `bool` | If True, the function will also run bots that have already started but not yet finished. This is useful if bots did not start up properly because of network issues. Default is False. | `False` |
| `wait` | `bool` | If True (the default), the function will wait for the bots to finish. | `True` |
| `kwargs` | `dict` | Additional keyword arguments to pass on to `litellm.completion()`. | `{}` |

Returns:

| Type | Description |
|---|---|
| `None \| List[Thread]` | None (bot conversation logs are stored in the database) if wait is True; a list of Threads running the bots if wait is False. |
Additional details
When running local models via llama.cpp, if you would like botex to start the llama.cpp server for you, run `start_llamacpp_server()` to start up the server prior to running `run_bots_on_session()`.
Example Usage
- Running botex with the default model (`gpt-4o-2024-08-06`):

```python
run_bots_on_session(
    session_id="your_session_id",
    botex_db="path/to/botex.sqlite3",
    api_key="your_openai_api_key",
    # Other parameters if and as needed
)
```
- Using a specific model supported by LiteLLM:

```python
run_bots_on_session(
    session_id="your_session_id",
    botex_db="path/to/botex.sqlite3",
    model="gemini/gemini-1.5-flash",
    api_key="your_gemini_api_key",
    # Other parameters if and as needed
)
```
- Using a local model with botex starting the llama.cpp server:

```python
llamacpp_config = {
    "server_path": "/path/to/llama/server",
    "local_llm_path": "/path/to/local/model",
    # Additional configuration parameters if and as needed
}
process_id = start_llamacpp_server(llamacpp_config)
run_bots_on_session(
    session_id="your_session_id",
    botex_db="path/to/botex.sqlite3",
    model="llamacpp",
    # Other parameters if and as needed
)
stop_llamacpp_server(process_id)
```
- Using a local model with an already running llama.cpp server that uses a URL different from the default (if you are using the default "http://localhost:8080", you can simply omit the `api_base` parameter)
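Such a call could look as follows (a sketch mirroring the analogous `run_single_bot()` example below; the URL is a placeholder):

```python
def run_with_remote_llamacpp(session_id: str):
    """Sketch: run bots against a llama.cpp server at a non-default URL."""
    import botex  # requires `pip install botex`

    botex.run_bots_on_session(
        session_id=session_id,
        botex_db="path/to/botex.sqlite3",
        model="llamacpp",
        api_base="http://yourhost:port",  # non-default server URL (placeholder)
        # Other parameters if and as needed
    )
```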
Source code in src/botex/otree.py
run_single_bot
Runs a single botex bot manually.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `url` | `str` | The participant URL to start the bot on. | required |
| `session_name` | `str` | The name of the oTree session. Defaults to "unknown". | `'unknown'` |
| `session_id` | `str` | The oTree ID of the oTree session. Defaults to "unknown". | `'unknown'` |
| `participant_id` | `str` | The oTree ID of the participant. Defaults to "unknown". | `'unknown'` |
| `botex_db` | `str` | The name of the SQLite database file to store botex data. | `None` |
| `full_conv_history` | `bool` | Whether to keep the full conversation history. This will increase token use and only work with very short experiments. Default is False. | `False` |
| `model` | `str` | The model to use for the bot. Default is 'gpt-4o-2024-08-06'. If you want to use local models, we suggest that you use llama.cpp; in this case, set this string to 'llamacpp'. | `'gpt-4o-2024-08-06'` |
| `api_key` | `str` | The API key for the model that you use. If None (the default), it will be obtained from environment variables by LiteLLM (e.g., OPENAI_API_KEY or GEMINI_API_KEY). | `None` |
| `api_base` | `str` | The base URL for the LLM server. Default is None so as not to interfere with the default LiteLLM behavior. If you want to use a local model with llama.cpp and have not explicitly set this parameter, it will default to 'http://localhost:8080'. | `None` |
| `throttle` | `bool` | Whether to slow down the bot's requests. Slowing down the requests can help to avoid rate limiting. Default is False. | `False` |
| `user_prompts` | `dict` | A dictionary of user prompts to override the default prompts that the bot uses. If a key is not present in the dictionary, the default prompt will be used. If a key that is not among the default prompts is present, the bot will exit with a warning and not run, to make sure that the user is aware of the issue. | `None` |
| `wait` | `bool` | If True (the default), the function will wait for the bot to finish. | `True` |
| `kwargs` | `dict` | Additional keyword arguments to pass on to `litellm.completion()`. | `{}` |

Returns:

| Type | Description |
|---|---|
| `None \| Thread` | None (the conversation is stored in the botex database) if wait is True; the Thread running the bot if wait is False. |
Notes:
- When running local models via llama.cpp, if you would like botex to start the llama.cpp server for you, run `start_llamacpp_server()` to start up the server prior to running `run_single_bot()`.
Example Usage
- Using a model via LiteLLM:

```python
run_single_bot(
    url="your_participant_url",
    session_name="your_session_name",
    session_id="your_session_id",
    participant_id="your_participant_id",
    botex_db="path/to/botex.sqlite3",
    model="a LiteLLM model string, e.g. 'gemini/gemini-1.5-flash'",
    api_key="the API key for your model provider",
    # Other parameters if and as needed
)
```
- Using a local model with an already running llama.cpp server:

```python
run_single_bot(
    url="your_participant_url",
    session_name="your_session_name",
    session_id="your_session_id",
    participant_id="your_participant_id",
    botex_db="path/to/botex.sqlite3",
    model="llamacpp",
    api_base="http://yourhost:port",  # defaults to http://localhost:8080
    # Other parameters if and as needed
)
```
- Using a local model with botex starting the llama.cpp server:

```python
llamacpp_config = {
    "server_path": "/path/to/llama/server",
    "local_llm_path": "/path/to/local/model",
    # Additional configuration parameters if and as needed
}
process_id = start_llamacpp_server(llamacpp_config)
run_single_bot(
    url="your_participant_url",
    session_name="your_session_name",
    session_id="your_session_id",
    participant_id="your_participant_id",
    botex_db="path/to/botex.sqlite3",
    model="llamacpp",
    # Other parameters if and as needed
)
stop_llamacpp_server(process_id)
```
Source code in src/botex/otree.py
Python API: Export data
Running oTree experiments with botex generates two databases:
- The 'normal' experiment data that oTree collects.
- Additional data that botex collects, such as the prompting sequence between botex and the bots, as well as the answers and the reasoning behind the answers that the LLM bots provide.
botex provides functions to export these data from the botex and oTree databases. In addition, the function `normalize_otree_data()` can be used to re-organize the wide multi-app oTree data into a normalized format that is more convenient for downstream use.
read_participants_from_botex_db
Read the participants table from the botex database and return it as a list of dicts.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `session_id` | `str` | A session ID to filter the results. | `None` |
| `botex_db` | `str` | The name of a SQLite database file. If not provided, it will try to read the file name from the environment variable BOTEX_DB. | `None` |

Returns:

| Type | Description |
|---|---|
| `List[Dict]` | A list of dictionaries with participant data. |
Source code in src/botex/botex_db.py
read_conversations_from_botex_db
Reads the conversations table from the botex database. The conversation table contains the messages exchanged with the LLM underlying the bot. Each conversation is returned as a dictionary containing a JSON string with the message sequence.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `participant_id` | `str` | A participant ID to filter the results. | `None` |
| `botex_db` | `str` | The name of a SQLite database file. If not provided, it will try to read the file name from the environment variable BOTEX_DB. | `None` |
| `session_id` | `str` | A session ID to filter the results. | `None` |

Returns:

| Type | Description |
|---|---|
| `List[Dict]` | A list of dictionaries with the conversation data. |
Source code in src/botex/botex_db.py
read_responses_from_botex_db
Extracts the responses and their rationales from the botex conversation data and returns them as a list of dicts.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `botex_db` | `str` | The name of a SQLite database file. If not provided, it will try to read the file name from the environment variable BOTEX_DB. | `None` |
| `session_id` | `str` | A session ID to filter the results. | `None` |

Returns:

| Type | Description |
|---|---|
| `List[Dict]` | A list of dictionaries with the rationale data. |
Source code in src/botex/botex_db.py
export_participant_data
Export the participants table from the botex database, retrieved by calling `read_participants_from_botex_db()`, to a CSV file.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `csv_file` | `str` | The file path to save the CSV file. | required |
| `botex_db` | `str` | The file path to the botex sqlite3 file. If not provided, it will try to read the file name from the environment variable BOTEX_DB. | `None` |
| `session_id` | `str` | A session ID to filter the results. | `None` |

Returns:

| Type | Description |
|---|---|
| `None` | None (saves the CSV to the specified file path) |
Source code in src/botex/botex_db.py
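A sketch of a post-session export using this function and `export_response_data()` below (file names are placeholders; assumes botex is installed):

```python
def export_botex_data(session_id: str):
    """Sketch: write participant and response data for one session to CSV."""
    import botex  # requires `pip install botex`

    botex.export_participant_data(
        "participants.csv", botex_db="botex.sqlite3", session_id=session_id
    )
    botex.export_response_data(
        "responses.csv", botex_db="botex.sqlite3", session_id=session_id
    )
```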
export_response_data
Export the responses parsed from the bot conversations in the botex database, retrieved by calling `read_responses_from_botex_db()`, to a CSV file.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `csv_file` | `str` | The file path to save the CSV file. | required |
| `botex_db` | `str` | The file path to the botex sqlite3 file. If not provided, it will try to read the file name from the environment variable BOTEX_DB. | `None` |
| `session_id` | `str` | A session ID to filter the results. | `None` |

Returns:

| Type | Description |
|---|---|
| `None` | None (saves the CSV to the specified file path) |
Source code in src/botex/botex_db.py
export_otree_data
Export wide data from an oTree server.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `csv_file` | `str` | Path to the CSV file where the data should be stored. | required |
| `server_url` | `str` | URL of the oTree server. If None (the default), the function will try to use the oTree server URL from the environment variable OTREE_SERVER_URL. | `None` |
| `admin_name` | `str` | Admin username. Defaults to "admin". | `'admin'` |
| `admin_password` | `str` | Admin password. If None (the default), the function will try to use the oTree admin password from the environment variable OTREE_ADMIN_PASSWORD. | `None` |
| `time_out` | `int` | Timeout in seconds to wait for the download. Defaults to 10. | `10` |

Raises:

| Type | Description |
|---|---|
| `Exception` | If the download does not succeed within the timeout. |
Returns None (data is stored in the CSV file).
Detail
The function uses Selenium and a headless Chrome browser to download the CSV file. Ideally, it would use an oTree API endpoint instead.
Source code in src/botex/otree.py
normalize_otree_data
Normalize oTree data from wide to long format, then reshape it into a set of list-of-dicts structures. Optionally save it to a set of CSV files.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `otree_csv_file` | `str` | Path to a wide multi-app oTree CSV file. | required |
| `var_dict` | `dict` | A dict to customize the exported data. See the additional details below. | `{'participant': {'code': 'participant_code', 'time_started_utc': 'time_started_utc', '_current_app_name': 'current_app', '_current_page_name': 'current_page'}, 'session': {'code': 'session_code'}}` |
| `store_as_csv` | `bool` | Whether to store the normalized data as CSV files. Defaults to False. | `False` |
| `data_exp_path` | `str` | Path to the folder where the normalized CSV files should be stored. Defaults to '.' (current folder). | `'.'` |
| `exp_prefix` | `str` | Prefix to be used for the CSV file names. Defaults to '' (no prefix). | `''` |

Returns:

| Type | Description |
|---|---|
| `dict` | A dict whose keys are table names (e.g. 'session', 'participant', 'myapp_group', 'myapp_player', etc.) and whose values are lists of dictionaries (i.e., row data). |
Additional details
The `var_dict` parameter allows customizing the exported data. Its keys are the names of the oTree apps, plus the reserved keys 'participant' and 'session' for the participant and session data, respectively. Its values are dictionaries that map the original column names to the desired column names. All variables that are not included in the dict are omitted from the output.
Source code in src/botex/otree.py
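A sketch of a typical call, normalizing a wide oTree export and storing one CSV file per table (paths and prefix are placeholders; assumes botex is installed):

```python
def normalize_export(csv_path: str):
    """Sketch: normalize a wide oTree export into per-table data."""
    import botex  # requires `pip install botex`

    tables = botex.normalize_otree_data(
        csv_path,
        store_as_csv=True,
        data_exp_path=".",
        exp_prefix="exp1_",  # placeholder prefix for the output file names
    )
    # Keys are table names such as 'session', 'participant', '<app>_player'.
    return tables
```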