Azure Data Factory connects to resources through linked services, describes data with datasets, and performs work through pipelines of activities. The key concepts are: linked services, which store the connection strings and credentials for external resources; datasets, which point to the input or output data those resources hold; data flows, which provide visual, code-free data transformations; activities, which take datasets as inputs and outputs and perform a unit of work; pipelines, which group and manage related activities; and triggers, which determine when pipelines execute.
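As a rough illustration of how these pieces fit together, the sketch below uses the azure-mgmt-datafactory Python SDK (plus azure-identity) to create a linked service, two datasets, and a pipeline with a single copy activity, then starts a run. All resource names, the subscription ID, the connection string, and the blob paths are hypothetical placeholders, and exact model names can vary between SDK versions.

```python
# Sketch: wiring up ADF building blocks with the azure-mgmt-datafactory SDK.
# All names (resource group, factory, connection string, paths) are
# hypothetical placeholders -- substitute your own values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureStorageLinkedService, LinkedServiceResource, LinkedServiceReference,
    AzureBlobDataset, DatasetResource, DatasetReference,
    CopyActivity, BlobSource, BlobSink, PipelineResource,
)

rg, factory = "my-rg", "my-factory"  # hypothetical names
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Linked service: stores the connection string to an external resource.
ls = LinkedServiceResource(properties=AzureStorageLinkedService(
    connection_string="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."))
client.linked_services.create_or_update(rg, factory, "StorageLS", ls)

# Datasets: point at the input and output data held by that resource.
ls_ref = LinkedServiceReference(type="LinkedServiceReference", reference_name="StorageLS")
ds_in = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="input", file_name="data.csv"))
ds_out = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="output"))
client.datasets.create_or_update(rg, factory, "InputDS", ds_in)
client.datasets.create_or_update(rg, factory, "OutputDS", ds_out)

# Activity: takes datasets as input/output; here, a blob-to-blob copy.
copy = CopyActivity(
    name="CopyBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDS")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDS")],
    source=BlobSource(), sink=BlobSink())

# Pipeline: groups and manages activities.
client.pipelines.create_or_update(rg, factory, "CopyPipeline",
                                  PipelineResource(activities=[copy]))

# A trigger would normally decide when this runs; here a one-off run is started directly.
run = client.pipelines.create_run(rg, factory, "CopyPipeline", parameters={})
print(f"Started pipeline run {run.run_id}")
```

The same definitions could equally be authored as JSON in the ADF Studio UI or deployed via ARM templates; the SDK version simply makes the relationship between linked services, datasets, activities, and pipelines explicit in code.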